
E. Wentzel, L. Ovcharov

Applied Problems in Probability Theory

MIR PUBLISHERS MOSCOW


Е. С. Вентцель, Л. А. Овчаров
Прикладные задачи по теории вероятностей
Москва, «Радио и связь»

E.Wentzel and L.Ovcharov

Translated from Russian by Irene Aleksanova

Mir Publishers Moscow



First published 1986
Revised from the 1983 Russian edition

© Издательство «Радио и связь», 1983
© English translation, Mir Publishers, 1986

AUTHORS' PREFACE

This book is based on many years of experience of teaching probability theory and its applications at higher educational establishments. It contains many of the problems we ourselves encountered in our research and consultative work. The problems are related to a variety of fields including electrical engineering, radio engineering, data transmission, computers, information systems, reliability of technical devices, preventive maintenance and repair, accuracy of apparatus, consumer service, transport, and the health service.
The text is divided into eleven chapters, each of which begins with a short theoretical introduction which is followed by relevant formulas.
The problems differ both in the fields of application and in difficulty. At the beginning of each chapter the reader will find comparatively simple problems whose purpose is to help the reader grasp the fundamental concepts and acquire and consolidate the experience of applying probabilistic methods. Then follow more complicated applied problems, which can be solved only after the requisite theoretical knowledge has been acquired and the necessary techniques mastered.
We have avoided the standard typical problems which can be solved mechanically. Many problems may prove difficult for beginners and experienced readers alike (the problems we believe most difficult are marked by an asterisk). In the interest of the reader most of the problems have both answers and detailed solutions, and they are given immediately after the problem rather than at the end of the book; we wrote the book for a laborious and thoughtful reader who will first try to find his own answer. This structure is very convenient and has justified itself in another book we have written, The Theory of Probability (Nauka, Moscow, 1973), which has been republished many times both at home and abroad (some problems in that edition have been repeated in this book).
We believe that statements and detailed solutions of nontrivial problems which demonstrate certain of the techniques of problem solving are particularly interesting. Our aim is not just to solve a problem but to use the simplest and most general technique. Some problems have been given several different solutions. In many cases a method of solution used has a general nature and can be applied in several fields. We have paid special attention to numerical characteristics, which make it possible to solve a number of problems with exceptional simplicity. The applied problems using the theory of Markov stochastic processes have been given the greatest consideration.
The brief theoretical sections which open each chapter do not usually repeat what existing textbooks on probability theory present, but have been based on new methods.
Thus this book is, in a certain sense, intermediate between a collection of problems and a textbook on the theory of probabilities. It should be useful to a wide variety of readers such as students and lecturers at higher schools, engineers and research workers who require a probabilistic approach to their work. Note that the detailed solutions and the consideration given to problem solving techniques make this book suitable for independent study.
We wish to express our gratitude to B. V. Gnedenko, Academician of the Ukrainian Academy of Sciences, who read the manuscript very attentively and made a number of valuable remarks, and also to V. S. Pugachev, Academician of the USSR Academy of Sciences, whom we consulted frequently when we worked on the book and whose methodical principles and notation we substantially followed.
PREFACE TO THE ENGLISH EDITION

We wish to express our satisfaction at having the opportunity to bring


our techniques of solving applied problems in probability theory
to the notice of the English reader.
We wrote this book so that it could be used both as a study aid in probability theory and as a collection of problems, of which there are about 700.
The theoretical part at the beginning of each chapter and the method-
ical instructions for solving the various problems make it possible
to use the book as a study aid. The solutions of many of the problems
in the book are important both from an educational point of view and
because they can be used when investigating various applied engi-
neering problems.
The methodology and the notation in this book correspond, in the
main, to those used in V. S. Pugachev's book [6] which was recently
published in Great Britain.
When the book was being prepared for translation into English,
a number of corrections and additions were made which improved the
content of the book. During this preparatory work, Assistant Professor
Danilov made an essential contribution, for which we want to express
our gratitude.
E. Wentzel, L. Ovcharov
CONTENTS

Authors' Preface 5
Preface to the English Edition 7
Chapter 1. Fundamental Concepts of Probability Theory. Direct Calculation of Probability in an Urn Model 9
Chapter 2. Algebra of Events. Rules for Adding and Multiplying Probabilities 29
Chapter 3. The Total Probability Formula and Bayes's Theorem 65
Chapter 4. Discrete Random Variables 88
Chapter 5. Continuous and Mixed Random Variables 112
Chapter 6. Systems of Random Variables (Random Vectors) 144
Chapter 7. Numerical Characteristics of Functions of Random Variables 166
Chapter 8. Distributions of Functions of Random Variables. The Limit Theorems of Probability Theory 217
Chapter 9. Random Functions 260
Chapter 10. Flows of Events. Markov Stochastic Processes 324
Chapter 11. Queueing Theory 363
Appendices 420
Bibliography 432

CHAPTER 1

Fundamental Concepts
of Probability Theory.
Direct Calculation
of Probability in an Urn Model
1.0. The theory of probability is the mathematical study of random phenomena. The concept of a random event (or simply event) is one of the principal concepts in probability theory. An event is the outcome of an experiment (or a trial). Six dots appearing on the top face of a die, the failure of a device during its service life, and a distortion in a message transmitted over a communication channel are all examples of events. Each event has an associated quantity which characterizes how likely its occurrence is; this is called the probability of the event.
There are several approaches to the concept of probability. The "classical" approach is to calculate the number of favourable outcomes of a trial and divide it by the total number of possible outcomes [see formula (1.0.6) below]. The frequency or statistical approach is based on the concept of the frequency of an event in a long series of trials. The frequency of an event in a series of N trials is the ratio of the number of trials in which it occurs to the total number of trials. There are random events for which a stability of the frequencies is observed: with an increase in the number N of independent trials, the frequency of the event stabilizes and tends to a certain constant quantity, which is called the probability of the event.
The modern construction of probability theory is based on an axiomatic approach using the fundamental concepts of set theory. This approach to probability theory is known as the set-theoretical approach.
Here are the fundamental concepts of set theory.
A set is a collection of objects each of which is called an element of the set. A set of students who study at a given school, the set of natural numbers which do not exceed 100, a set of points on a plane lying within or on a circle with a unit radius and centre at the origin are all examples of sets.
There are several ways of designating sets. A set can be denoted either by one capital letter, or by the enumeration of its elements given in braces, or by indicating (also in braces) a rule which associates an element with a set. For example, the set M of natural numbers from 1 to 100 can be written
M = {1, 2, ..., 100} = {i is an integer; 1 ≤ i ≤ 100}.
The set C of points on a plane which lie within or on a circle with centre at the origin can be written in the form C = {x² + y² ≤ R²}, where x and y are the Cartesian coordinates of the point and R is the radius of the circle.
Depending on how many elements it has, a set may be finite or infinite. The set M = {1, 2, ..., 100} is finite and contains 100 elements (in a particular case a finite set can consist of only one element). The set of all natural numbers N = {1, 2, ..., n, ...} is infinite; the set of even numbers N2 = {2, 4, ..., 2n, ...} is also infinite. An infinite set is said to be countable if all of its terms can be enumerated (both of the infinite sets N and N2 given above are countable). The set C of points within or on a circle of radius R > 0,
C = {x² + y² ≤ R²},   (1.0.1)
is infinite and uncountable (its elements cannot be enumerated).
Two sets A and B coincide (or are equivalent) if they consist of the same elements (the coincidence of sets is expressed thus: A = B). For example, the set of roots of the equation x² - 5x + 4 = 0 coincides with the set {1, 4} (and also with the set {4, 1}).
The notation a ∈ A means that an object a is an element of a set A; or, in other words, a belongs to A. The notation a ∉ A means that an object a is not an element of a set A.

An empty set is a set with no elements. It is designated ∅. For example, the set of points on a plane whose coordinates (x, y) satisfy the inequality x² + y² ≤ -1 is an empty set: {x² + y² ≤ -1} = ∅. All empty sets are equivalent.
A set B is said to be a subset of a set A if all the elements of B also belong to A. The notation is B ⊆ A (or A ⊇ B). Examples: {1, 2, ..., 100} ⊆ {1, 2, ..., 1000}; {1, 2, ..., 100} ⊆ {1, 2, ..., 100}; {x² + y² ≤ 1} ⊆ {x² + y² ≤ 2}.
An empty set is a subset of any set A: ∅ ⊆ A.
We can use a geometrical interpretation to depict the inclusion of sets: the points on the plane are elements of the set (see Fig. 1.0.1, where B is a subset of A).

Fig. 1.0.1    Fig. 1.0.2

The union (logical sum) of the sets A and B is a set C = A + B which consists of all the elements of A and all those of B (including those which belong to both A and B). In short, a union of two sets is a collection of elements belonging to at least one of them. Examples: {1, 2, ..., 100} + {50, 51, ..., 200} = {1, 2, ..., 200}; {1, 2, ..., 100} + {1, 2, ..., 1000} = {1, 2, ..., 1000}; {1, 2, ..., 100} + ∅ = {1, 2, ..., 100}. The union of the two sets A and B is shown in Fig. 1.0.2; the shaded area is A + B.
We can similarly define a union of any number of sets: A1 + A2 + ... + An = Σ_{i=1}^{n} Ai is a set consisting of all the elements which belong to at least one of the sets A1, ..., An. We can also consider a union of an infinite (countable) number of sets: Σ_{i=1}^{∞} Ai = A1 + A2 + ... + An + ... Example: {1, 2} + {2, 3} + {3, 4} + ... + {n, n + 1} + ... = {1, 2, 3, ..., n, ...}.
The intersection (logical product) of two sets A and B is the set D = A·B that consists of the elements which belong to both A and B. Examples: {1, 2, ..., 100}·{50, 51, ..., 200} = {50, 51, ..., 100}; {1, 2, ..., 100}·{1, 2, ..., 1000} = {1, 2, ..., 100}; {1, 2, ..., 100}·∅ = ∅. An intersection of two sets A and B is shown in Fig. 1.0.3.
We can similarly define the intersection of any number of sets. The set A1A2 ... An = Π_{i=1}^{n} Ai consists of all the elements which belong to all the sets A1, A2, ..., An simultaneously. The definition can be extended to an infinite (countable) number of sets: Π_{i=1}^{∞} Ai is a set consisting of elements belonging to all the sets A1, A2, ..., An, ... simultaneously.
Two sets A and B are said to be disjoint (nonintersecting) if their intersection is an empty set: A·B = ∅, i.e. no element belongs to both A and B (Fig. 1.0.4). Figure 1.0.5 illustrates several disjoint sets.
As in the notation of an ordinary multiplication, the multiplication sign is usually omitted.
It is sufficient to have this elementary knowledge of set theory in order to use the set-theoretical construction of probability theory.
Assume that an experiment (trial) is conducted whose result is not known beforehand, i.e. is accidental. Let us consider the set Q of all possible outcomes of the experiment; each of its elements ω ∈ Q (each outcome) is known as an elementary event and the whole set Q as the space of elementary events or the sample space. Any subset of the set Q is known as an event (or random event); and any event A is a subset of the set Q, viz. A ⊆ Q.
Example: an experiment involves tossing a die; the space of elementary events is Q = {1, 2, 3, 4, 5, 6}; an event A is an even score, A = {2, 4, 6}; A ⊆ Q. In particular, we can consider the event Q (since every set is a subset of itself); it is said to be a certain event (it must occur in every experiment). We can add the empty set ∅ to the whole space Q of elementary events; this set is also an event and is said to be an impossible event (it cannot occur as a result of an experiment).

Fig. 1.0.3    Fig. 1.0.4    Fig. 1.0.5

An example of a certain event: {a score not exceeding 6 when a die is tossed}; an example of an impossible event: {a score of 7 dots on a face of a die}.
Note that there are different ways of defining an elementary event in the same experiment; say, in a random throw of a point on a plane, the position of the point can be defined both by a pair of Cartesian coordinates (x, y) and by a pair of polar coordinates (ρ, φ).
Two mutually disjoint events A and B (such that AB = ∅) are said to be incompatible; the occurrence of one precludes the occurrence of the other. Several events A1, A2, ..., An are said to be pairwise incompatible (or simply incompatible), or two-by-two mutually exclusive events, if the occurrence of one of them precludes the occurrence of each of the others.
We say that several events A1, A2, ..., An form a complete group if Σ_{i=1}^{n} Ai = Q, i.e. if their sum is a certain event (in other words, if at least one of them is certain to occur as a result of an experiment). Example: an experiment consists in tossing a die; the events A = {1, 2}, B = {2, 3, 4}, and C = {4, 5, 6} form a complete group; A + B + C = {1, 2, 3, 4, 5, 6} = Q.
We now introduce axioms of probability theory. Assume that every event is associated with a number called its probability. The probability of an event A is designated as P(A)*). We require that the probabilities of the events should satisfy the following axioms.
I. The probability of an event A falls between zero and unity:
0 ≤ P(A) ≤ 1.   (1.0.2)
II. Probability addition rule: if A and B are mutually exclusive, then
P(A + B) = P(A) + P(B).   (1.0.3)
Axiom (1.0.3) can be immediately generalized to any finite number of events: if A1, A2, ..., An are mutually exclusive events, then
P(Σ_{i=1}^{n} Ai) = Σ_{i=1}^{n} P(Ai).   (1.0.4)

*) If an event (a set) is denoted by a verbal description of its properties, or by a formula of type (1.0.1), or by an enumeration of the elements of the set rather than by a letter, we do not use parentheses but use braces when designating the probability, e.g. P{x² + y² < 2}.

III. Probability addition rule for an infinite sequence of events: if A1, A2, ..., An, ... are mutually exclusive events, then
P(Σ_{i=1}^{∞} Ai) = Σ_{i=1}^{∞} P(Ai).   (1.0.5)
The axioms of probability theory can be used to calculate the probability of any events (the subsets of Q) from the probabilities of the elementary events (if there is a finite or countable number of them). It is not necessary to consider here the ways of determining the probability of the elementary events. In practice they are found either from a consideration of the symmetry of the experiment (for a symmetric die, for instance, it is natural to assume the appearance of each face to be equipossible) or on the basis of experimental data (frequencies).
If the possible outcomes of an experiment have symmetry, then the probabilities can be directly calculated from the so-called urn model (model of events). This technique is based on the assumption that the elementary events are equipossible. Several events A1, A2, ..., An are said to be equipossible if they have the same probability by virtue of the symmetry of the conditions of the experiment relative to those events: P(A1) = P(A2) = ... = P(An).
If in an experiment we can represent the sample space Q as a complete group of disjoint and equipossible events ω1, ω2, ..., ωn, then the events are called cases (chances) and the experiment is said to reduce to the urn model.
A case ωi is said to be favourable to an event A if it is an element of the set A: ωi ∈ A.
Since the cases ω1, ω2, ..., ωn form a complete group of events, it follows that
Σ_{i=1}^{n} ωi = Q.
Since the elementary events ω1, ω2, ..., ωn are incompatible, it follows from the probability addition rule that
P(Σ_{i=1}^{n} ωi) = P(Q) = Σ_{i=1}^{n} P(ωi) = 1.
Since the elementary events ω1, ω2, ..., ωn are equipossible, their probability is the same and is equal to 1/n:
P(ω1) = P(ω2) = ... = P(ωn) = 1/n.
This formula yields a so-called classical formula for the probability of an event: if an experiment reduces to an urn model, then the probability of an event A in that experiment can be calculated by the formula
P(A) = m_A/n,   (1.0.6)
where m_A is the number of cases favourable to the event A and n is the total number of cases.
Formula (1.0.6), which was once accepted as the definition of probability, is now, with the modern axiomatic approach, a corollary of the probability addition rule.
Let us consider an example. Three white and four black balls are thoroughly stirred in an urn and a ball is drawn at random. Construct the sample space for this experiment and find the probability of the event A = {a white ball is drawn}.
We now label the balls 1 to 7 inclusive; the first three balls are white and the last four are black. Hence
Q = {1, 2, 3, 4, 5, 6, 7},  A = {1, 2, 3}.
Since the experimental conditions are symmetric with respect to all the balls (a ball is drawn at random), the elementary events are equipossible. Since they are incompatible and form a complete group, the probability of event A can be found by formula (1.0.6): P(A) = m_A/n = 3/7.
When the experiment is symmetric with respect to the possibility of outcomes, formula (1.0.6) makes it possible to calculate the probabilities of events directly from the conditions of the experiment.
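As an illustrative aside (not part of the original text), here is a minimal Python sketch of the direct calculation by formula (1.0.6) for the urn example above; the labelling of the balls follows that example.

```python
from fractions import Fraction

# Urn model from the example above: balls 1-3 are white, balls 4-7 are black.
sample_space = set(range(1, 8))        # seven equipossible elementary events (cases)
A = {1, 2, 3}                          # event: "a white ball is drawn"

m_A = len(A)                           # number of cases favourable to A
n = len(sample_space)                  # total number of cases
print(Fraction(m_A, n))                # formula (1.0.6): P(A) = m_A/n = 3/7
```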
A "geometrical" approach to the calculation of the probabilities of events is a natural generalization and extension of a direct calculation of probabilities in the urn model. It is employed when the sample space Q includes an uncountable set of elementary events ω ∈ Q and by symmetry none of them is more likely than the

Fig. 1.0.6    Fig. 1.0.7    Fig. 1.0.8

others as concerns the possibility of occurrence*). Assume that the sample space Q is a domain on a plane (Fig. 1.0.6) and that the elementary events ω are points within the domain. If the experiment has a symmetry of possible outcomes (say, a "point" object is thrown at random in the interior of the domain), then all the elementary events are "equal in rights" and it is natural to assume that the probabilities that the elementary event ω will fall in domains I and II of the same size S are equal, and the probability of any event A ⊆ Q is equal to the ratio of the area S_A of domain A to the area of the whole domain Q:
P(A) = S_A/S_Q.   (1.0.7)

Formula (1.0.7) is a generalization of the classical formula (1.0.6) to an uncountable set of elementary events. The symmetry of the experimental conditions with respect to its elementary outcomes ω is usually formulated using the words "at random". In essence this is equivalent to the random choice of a ball in an urn (see above). In textbooks the probabilities calculated by this technique are often called "geometrical probabilities".
Assume, for instance, that two points with abscissas x and y are put at random on the interval from 0 to 2 (Fig. 1.0.7). Find the probability that the distance L between them is less than unity. The elementary event ω is characterized by a pair of coordinates (x, y). The space of elementary events is a square with side 2 on the x, y plane (Fig. 1.0.8). We then have L = |y - x| and so the event A = {|y - x| < 1} is associated with the domain A which is hatched in Fig. 1.0.8.
P(A) = P{|y - x| < 1} = S_A/S_Q = 3/4.
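A Monte Carlo check of this answer may be helpful (an illustrative sketch, not part of the book's solution): x and y are sampled uniformly and independently on (0, 2), and the observed frequency of |y - x| < 1 is compared with the value 3/4 given by formula (1.0.7).

```python
import random

random.seed(1)
N = 1_000_000
hits = sum(1 for _ in range(N)
           if abs(random.uniform(0, 2) - random.uniform(0, 2)) < 1)
print(hits / N)   # close to 3/4 = 0.75
```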
If the space of elementary events is not plane but three-dimensional, then we must replace the areas S_A and S_Q in formula (1.0.7) by the volumes V_A and V_Q, and for a one-dimensional space, by the lengths L_A and L_Q of the corresponding segments of a straight line.

*) We do not say that the elementary events ω are "equipossible"; we shall ascertain in Chapter 5 that the probability of each of them is equal to zero.

Problems and Exercises


1.1. Find out whether the events indicated in each example form a complete group of events for the given experiment (answer yes or no).
(1) An experiment involves tossing a coin; the events are
A1 = {heads}; A2 = {tails}.
(2) An experiment involves tossing two coins; the events are
B1 = {two heads}; B2 = {two tails}.
(3) An experiment involves throwing two dice; the events are
C1 = {each die comes up 6};
C2 = {none of the dice comes up 6};
C3 = {one die comes up 6 and the other does not}.
(4) Two signals are sent over a communication channel; the events are
D1 = {at least one signal is not distorted};
D2 = {at least one signal is distorted}.
(5) Three messages are sent over a communication channel; the events are
E1 = {the three messages are transmitted without an error};
E2 = {the three messages are transmitted with errors};
E3 = {two messages are transmitted with errors and one without an error}.
Answer. (1) yes, (2) no, (3) yes, (4) yes, (5) no.
1.2. Regarding each group of events say whether they are incompatible in the given experiment (yes, no).
(1) An experiment involves tossing a coin; the events are
A1 = {a head}; A2 = {a tail}.
(2) An experiment involves tossing two coins; the events are
B1 = {the first coin comes up heads};
B2 = {the second coin comes up heads}.
(3) Two shots are fired at a target; the events are
C0 = {no hits}; C1 = {one hit}; C2 = {two hits}.
(4) The same as above; the events are
D1 = {one hit}; D2 = {one miss}.
(5) Two cards are selected from a pack; the events are
E1 = {both cards are from black suits};
E2 = {there is a queen of clubs among the selected cards};
E3 = {there is an ace of spades among the selected cards}.
(6) Three messages are transmitted by radio; the events are
F1 = {there is an error in the first message};
F2 = {there is an error in the second message};
F3 = {the first message contains an error and the second does not}.
Answer. (1) yes, (2) no, (3) yes, (4) no, (5) no, (6) no.
1.3. Regarding each group of events say whether they are equipossible in the given experiment (answer yes or no).
(1) An experiment involves tossing a coin; the events are
A1 = {it comes up heads}; A2 = {it comes up tails}.
(2) An experiment involves tossing an unfair (concave) coin; the events are as above, A1 and A2.
(3) A shot is fired at a target; the events are
B1 = {a hit}; B2 = {a miss}.
(4) An experiment involves tossing two coins; the events are
C1 = {two heads}; C2 = {two tails}; C3 = {one head and one tail}.
(5) A card is selected from a pack at random; the events are
D1 = {a heart}; D2 = {a diamond}; D3 = {a club}; D4 = {a spade}.
(6) An experiment involves throwing a die; the events are
E1 = {the score is no less than 3}; E2 = {the score is no more than 3}.
(7) Three messages of the same length are sent over a communication channel under the same conditions; the events are
F1 = {an error in the first message};
F2 = {an error in the second message};
F3 = {an error in the third message}.
Answer. (1) yes, (2) no, (3) no in a general case, (4) no, (5) yes, (6) no, (7) yes.

1.4. Regarding each of the following groups of events say whether they form a complete group, whether they are incompatible, whether they are equipossible, and whether they form a group of cases.
(1) An experiment involves tossing a (fair) coin; the events are
A1 = {it comes up heads}; A2 = {it comes up tails}.
(2) An experiment involves tossing two coins; the events are
B1 = {two heads}; B2 = {two tails}; B3 = {one head and one tail}.
(3) An experiment involves throwing a die; the events are
C1 = {1 or 2}; C2 = {2 or 3}; C3 = {3 or 4}; C4 = {4 or 5}; C5 = {5 or 6}.
(4) One card is selected from a pack of 36 at random; the events are
D1 = {an ace}; D2 = {a king}; D3 = {a queen}; D4 = {a knave}; D5 = {a ten}; D6 = {a nine}; D7 = {an eight}; D8 = {a seven}; D9 = {a six}.
(5) A shot is fired at a target; the events are
E1 = {a hit}; E2 = {a miss}.
(6) Three messages of the same length are sent under the same conditions; the events are
F1 = {the first message is distorted};
F2 = {the second message is distorted};
F3 = {the third message is distorted}.
(7) Two devices are operated for a time τ; the events are
G0 = {none of the devices failed};
G1 = {one device failed and the other did not};
G2 = {both devices failed}.
Answer. (1) yes, yes, yes, yes; (2) yes, yes, no, no; (3) yes, no, yes, no; (4) yes, yes, yes, yes; (5) yes, yes, no, no; (6) no, no, yes, no; (7) yes, yes, no, no.
1.5. A coin is tossed until two heads or two tails are obtained in succession. Construct the sample space for the experiment and isolate the subset corresponding to the event A = {no more than three tossings will be necessary}. Can the probability of the event be found as the ratio of the number of elementary events favourable to A to the total number of elementary events, and if not, why?
Solution. The elementary events which form the set Q (the heads are denoted by the letter "h" and the tails by "t") are ω1 = {h, h}, ω2 = {t, t}, ω3 = {h, t, t}, ω4 = {t, h, h}, ω5 = {h, t, h, h}, ω6 = {t, h, t, t}, ...; the number of events is infinite but countable: Q = {ω1, ω2, ω3, ...}. The subset of elementary events favourable to A is A = {ω1, ω2, ω3, ω4}.
It is impossible to find here the probability of the event A as the ratio of the number of elementary events favourable to A to the total number of elementary events since the elementary events ω1, ω2, ω3, ... are not equipossible: each pair is less probable than its predecessor [see problem 2.18 to find P(A)].
1.6. A polyhedron with k faces (k > 3) labelled 1, 2, ..., k is thrown onto a plane at random and it falls one or another face down. Construct the sample space for the experiment and isolate the subset corresponding to the event A = {the polyhedron falls with a face whose number does not exceed k/2 down}.
Solution. The space Q consists of k elementary events: Q = {1, 2, ..., k}, where the numbers correspond to the numbers of the faces. The subset A consists of the elementary events A = {1, 2, ..., [k/2]}, where [k/2] is the integer part of k/2.
1.7. Under the conditions of the preceding problem, the polyhedron
is regular; the possible number of faces are k = 4 (a tetrahedron); k = 6
(a cube); k = 8 (an octahedron); k = 12 (a dodecahedron); k = 20
(an icosahedron). Find the probability of the event A for each polyhedron.
Solution. For a regular (symmetric) polyhedron the appearance of each face is equipossible and, therefore, we can calculate P(A) by formula (1.0.6). Since the number of faces in each regular polyhedron is even, it follows that [k/2] = k/2 and so for each of them P(A) = 1/2.
1.8. There are a white and b black balls in an urn. A ball is drawn from
the urn at random. Find the probability that the ball is white.
Answer. a/(a + b).
1.9. There are a white and b black balls in an urn. A ball is drawn
from the urn and put aside. The ball is white. Then one more ball is
drawn. Find the probability that the ball is also white.
Answer. (a - 1)/(a + b - 1).
1.10. There are a white and b black balls in an urn. A ball is drawn
from the urn and put aside without noticing its colour. Then one more
ball is drawn. It is white. Find the probability that the first ball, which
was put aside, is also white.
Answer. (a - 1)/(a + b - 1).
1.11. An urn contains a white and b black balls. All the balls except for one ball are drawn from it. Find the probability that the last ball remaining in the urn is white.
Answer. a/(a + b).
1.12. An urn contains a white and b black balls. All the balls
are drawn from it in succession. Find the probability that the second
drawn ball is white.
Answer. a/(a + b).
1.13. There are a white and b black balls in an urn (a ≥ 2). Two balls are drawn together. Find the probability that both balls are white.

Solution. An event A = {two white balls}. The total number of outcomes
n = C_{a+b}^2 = (a + b)(a + b - 1)/(1·2),
where C_k^m = k!/(m!(k - m)!) is the number of combinations of k elements taken m at a time.
The number of favourable outcomes
m_A = C_a^2 = a(a - 1)/(1·2).
The probability of event A
P(A) = m_A/n = a(a - 1)/[(a + b)(a + b - 1)].
1.14. There are a white and b black balls in an urn (a ≥ 2, b ≥ 3). Five balls are drawn together. Find the probability p that two of them are white and three are black.
Solution. The total number of cases
n = C_{a+b}^5 = (a + b)(a + b - 1)(a + b - 2)(a + b - 3)(a + b - 4)/(1·2·3·4·5).
The number of favourable outcomes
m = C_a^2 C_b^3 = [a(a - 1)/(1·2)]·[b(b - 1)(b - 2)/(1·2·3)],
p = m/n = 10a(a - 1)b(b - 1)(b - 2)/[(a + b)(a + b - 1)(a + b - 2)(a + b - 3)(a + b - 4)].

1.15. A batch consists of k articles, l of which are faulty. A total of r articles are taken from the batch for inspection. Find the probability p that exactly s of the selected articles are faulty.
Answer. p = C_l^s C_{k-l}^{r-s} / C_k^r.
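A brief computational sketch of this answer, assuming Python's math.comb for the binomial coefficients (the function name and the sample figures below are ours, chosen only for illustration):

```python
from math import comb

def p_faulty(k, l, r, s):
    """Problem 1.15: probability that exactly s of the r inspected articles are
    faulty, when the batch of k articles contains l faulty ones."""
    return comb(l, s) * comb(k - l, r - s) / comb(k, r)

# e.g. a batch of 20 articles with 5 faulty ones, 4 taken for inspection:
print(p_faulty(20, 5, 4, 1))   # about 0.47
```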
1.16. A die is thrown once. Find the probabilities of the following events:
A = {the score is even};
B = {the score is no less than 5};
C = {the score is no more than 5}.
Answer. P(A) = 1/2, P(B) = 1/3, P(C) = 5/6.


1.17. There are a white and b black balls in an urn (a ≥ 2, b ≥ 2). Two balls are drawn at once. Which of the following events is more likely:
A = {the balls are of the same colour};
B = {the balls are different in colour}?
Solution.
P(A) = (C_a^2 + C_b^2)/C_{a+b}^2 = [a(a - 1) + b(b - 1)]/[(a + b)(a + b - 1)],
P(B) = C_a^1 C_b^1/C_{a+b}^2 = 2ab/[(a + b)(a + b - 1)].
Comparing the numerators of the fractions, we find that
P(A) < P(B) for a(a - 1) + b(b - 1) < 2ab, i.e. (a - b)² < a + b;
P(A) = P(B) for (a - b)² = a + b;
P(A) > P(B) for (a - b)² > a + b.
1.18. A box contains n enumerated articles. All the articles are taken out one by one at random. Find the probability that the numbers of the selected articles are successive: 1, 2, ..., n.
Answer. 1/n!
1.19. The same box as in the preceding problem, but each article is drawn and its number is written down, after which it is replaced and the articles are stirred. Find the probability that a natural sequence of numbers will be written: 1, 2, ..., n.
Answer. 1/n^n.
1.20. Eighteen teams participat e in a basketbal l champion ship, out
of which two groups, each consisting of 9 teams, are formed at random.
Five of the teams are first class. Find the probabilit ies of the following
events:
A = {all the first class teams get into the same group};
B = {two first class teams get into one group and three into the other}.
Answer.
P(A) = 2C_{13}^4 / C_{18}^9 = 1/34;  P(B) = (C_5^2 C_{13}^7 + C_5^3 C_{13}^6) / C_{18}^9 = 12/17.

1.21. A certain Petrov buys a bingo ticket and marks 6 of the 49 numbers. Then the six winning numbers, out of the 49, are announced. Find the probabilities of the following events:
A3 = {he guessed 3 out of 6 winning numbers};
A4 = {he guessed 4 out of 6 winning numbers};
A5 = {he guessed 5 out of 6 winning numbers};
A6 = {he guessed all the winning numbers}.
Solution. The problem is equivalent to drawing 6 balls from an urn in which there are 6 white balls (winning numbers) and 49 - 6 = 43 black balls (not winning):
P(A3) = C_6^3 C_{43}^3 / C_{49}^6 ≈ 0.01765,  P(A4) = C_6^4 C_{43}^2 / C_{49}^6 ≈ 0.000969,
P(A5) = C_6^5 C_{43}^1 / C_{49}^6 ≈ 0.00001845,  P(A6) = 1/C_{49}^6 ≈ 0.715·10⁻⁷.
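The four probabilities are easy to reproduce numerically; the following sketch (ours, not the book's) evaluates C_6^s C_{43}^{6-s}/C_{49}^6 for s = 3, 4, 5, 6.

```python
from math import comb

n = comb(49, 6)                              # total number of ways to mark 6 of 49
for s in (3, 4, 5, 6):
    p = comb(6, s) * comb(43, 6 - s) / n     # s winning and 6 - s non-winning numbers
    print(s, p)
# prints values close to 0.0177, 0.000969, 0.0000185, 7.15e-08
```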

1.22. Nine cards are labelled 0, 1, 2, 3, 4, 5, 6, 7, 8. Two cards are drawn at random and put on a table in a successive order, and then the resulting number is read, say, 07 (seven), 14 (fourteen) and so on. Find the probability that the number is even.
Solution. The evenness of a number is defined by its last digit, which must be even (zero is also an even number). The required probability is the probability that one of the numbers 0, 2, 4, 6, 8 will be the second to appear, i.e. 5/9.
1.23. Five cards are labelled 1, 2, 3, 4, 5. Two cards are drawn one after another. Find the probability that the number on the second card is larger than that on the first.
Solution. The experiment has two possible outcomes:
A = {the second number is larger than the first};
B = {the second number is smaller than the first}.
Since the conditions of the experiment are symmetric with respect to A and B, we have P(A) = P(B) = 1/2.
1.24. A box contains articles of the same type manufactured at different factories: a of them are manufactured at factory I, b at factory II and c at factory III. All the articles are taken from the box one after another and the factories at which they were manufactured are written down. Find the probability that an article manufactured at factory I will appear before that manufactured at factory II.
Solution. It is irrelevant whether any of the articles are from factory III. The probability we seek is the probability that an article manufactured at factory I is the first to be drawn, where a articles are from factory I and b articles are from factory II, i.e. a/(a + b).
1.25. There are two boxes containing standard elements for replacement. The first box contains a sound elements and b faulty ones, and the second box contains c sound and d faulty elements. One element is drawn at random from each box. Find the probability that both elements are sound.
Solution. Each element from the first box can be combined with each element from the second box; the number of cases n = (a + b)(c + d). The number of favourable cases m = ac and the probability of the event is ac/[(a + b)(c + d)].
1.26. Under the conditions of problem 1.25 find the probability that the two elements are different in quality.
Answer. (ad + bc)/[(a + b)(c + d)].
1.27. Under the same conditions find the probability that both elements are faulty.
Answer. bd/[(a + b)(c + d)].

1.28. A box contains k articles of the same type labelled 1, 2, ..., k. A total of l articles are taken from the box in succession at random; the number of each article taken out is written down and the article is replaced. Find the probability p that all the numbers written down are different.
Solution. The number of cases n = k^l. The number of favourable cases is equal to the number of arrangements of k elements taken l at a time, i.e. m = k(k - 1) ... (k - l + 1). The probability of the event
p = m/n = k(k - 1) ... (k - l + 1)/k^l = k!/[k^l (k - l)!].

1.29. A child who cannot read plays with blocks labelled with the
letters of the alphabet. He takes five blocks from which the word "table"
is formed. Then he scatters them and puts them side by side in an
arbitrary fashion. Find the probability p that he will again form the
same word.
Answer. p = 1/5! = 1/120.
1.30. The same question for the word "papaya", a tropical Amer-
ican tree.
Solution. The number of outcomes n = 6!; however the number of
favourable outcomes is now not unity as in problem 1.29 but m = 3!2!
since the repeating letters "a" and "p" can be put in any order: p =
3!2!/6! = 1/60.
1.31. Several cards are selected from a complete pack of 52 cards.
How many cards must we select to state, with a probability greater
than 0.50, that cards of the same suit will be among them?
Solution. We designate A = {the presence of at least two cards of
the same suit among the selected k cards}.
For k = 2 we have n = C_{52}^2, m_A = C_{13}^2·4; P(A) = 0.235 < 0.50.
For k = 3 we have n = C_{52}^3, m_A = C_{13}^3·4 + C_{13}^2·C_{39}^1·4; P(A) = 0.602 > 0.50.
Thus we have to select k ≥ 3 cards.
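A quick numerical check of these two values (an illustrative sketch of ours; it counts through the complementary event "all k cards are of different suits" rather than by the book's direct enumeration):

```python
from math import comb

def p_same_suit(k):
    # P{at least two cards of the same suit among k selected cards}
    all_different = comb(4, k) * 13**k     # choose k distinct suits, one card from each
    return 1 - all_different / comb(52, k)

print(p_same_suit(2), p_same_suit(3))      # about 0.235 and 0.602
```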
1.32. N people take seats at random at a round table (N > 2). Find the probability p that two fixed people A and B will sit side-by-side.
Solution. The total number of cases n = N!. Let us calculate the number of favourable cases m. There are 2N ways of seating the two people A and B side-by-side (N adjacent pairs of seats, each in two orders), and the other people can be seated in (N - 2)! ways; then m = 2N(N - 2)! and p = 2N(N - 2)!/N! = 2/(N - 1). There is a simpler way of solving the problem: let A sit where he pleases. Now there are N - 1 seats B can take, two of which are favourable; hence p = 2/(N - 1).
1.33. The same problem for a rectangular table with N people sitting arbitrarily along one side of the table.
Solution. n = N!; the favourable cases divide into two groups: (1) A sits at an end; (2) A does not sit at an end. The number in the first is m1 = 2(N - 2)! and the number in the second is m2 = 2(N - 2)(N - 2)!; p = (m1 + m2)/n = 2(N - 1)(N - 2)!/N! = 2/N.
1.34. There are M operators and N enumerated devices which they can service. Each operator chooses a device at random and with equal probability, but under the condition that none of the devices can be serviced by more than one operator. Find the probability that the devices labelled 1, 2, ..., M will be chosen.
Solution. The number of ways of distributing M operators over N devices is equal to the number of permutations of N elements taken M at a time, viz. n = N(N - 1) ... (N - M + 1). By the hypothesis all these ways are equipossible, i.e. they form a group of cases. The number of favourable cases (when only the first M devices are serviced) is m = M!. Hence the required probability
p = M!/[N(N - 1) ... (N - M + 1)] = 1/C_N^M.
1.35. There are K standard elements for replacement in a box, among which K1 elements are of the 1st type, ..., Ki elements are of the ith type, ..., Km elements are of the mth type, Σ_{i=1}^{m} Ki = K. k elements are taken from the box at random. Find the probability that there will be k1 elements of the 1st type, k2 elements of the 2nd type, ..., km elements of the mth type among them.
Solution. The total number of cases n is equal to the number of ways in which k elements can be selected from K elements: n = C_K^k. The number of favourable cases
m = Π_{i=1}^{m} C_{Ki}^{ki},
since k1 elements of the 1st type can be chosen in C_{K1}^{k1} ways, ..., km elements of the mth type can be chosen in C_{Km}^{km} ways, and various combinations of them can be formed. The required probability
p = Π_{i=1}^{m} C_{Ki}^{ki} / C_K^k.

1.36. A local post office is to send four telegrams; there are four communication channels in all. The telegrams are distributed at random over the communication channels; each telegram is sent over any channel with equal probability. Find the probability of the event A = {three telegrams are sent over one channel, one telegram over another channel, and two channels remain empty}.
Solution. The total number of cases n = 4⁴. The number of ways to choose a channel over which three telegrams are sent is C_4^1 = 4; the number of ways to choose a channel over which one telegram is sent is C_3^1 = 3. The number of ways to choose three telegrams out of four to send them over one channel is C_4^3 = 4. The total number of favourable cases m_A = 4·3·4.
P(A) = m_A/n = 4·3·4/4⁴ = 3/16.

1.37. M telegrams are distributed at random over N communication channels (N > M). Find the probability of an event
A = {no more than one telegram will be sent over each channel}.
Solution. The total number of cases is N^M. The number of ways to choose M channels out of N in order to send one telegram over each of them is C_N^M. The number of ways to choose one telegram out of M and to send it over the first of the channels is C_M^1 = M. The number of ways to choose the second telegram out of the remaining M - 1 is C_{M-1}^1 = M - 1, and so on. The total number of favourable cases
m_A = M(M - 1) ... 1 = M!,
P(A) = C_N^M · M!/N^M.
1.38*. A local post office is to send M telegrams and to distribute them at random over N communication channels. The channels are enumerated. Find the probability that exactly k1 telegrams will be sent over the 1st channel, k2 telegrams over the 2nd channel, and so on, kN telegrams over the Nth channel, with Σ_{i=1}^{N} ki = M.
Solution. The number of cases n = N^M. Let us find the number of favourable cases m. The number of ways to choose k1 telegrams out of M is C_M^{k1}; the number of ways to choose k2 telegrams out of the remaining M - k1 is C_{M-k1}^{k2}, and so on. The number of ways to choose kN telegrams out of M - (k1 + ... + k_{N-1}) = kN is C_{kN}^{kN} = 1. These numbers must be multiplied together:
m = C_M^{k1} C_{M-k1}^{k2} ... C_{M-(k1+...+k_{N-2})}^{k_{N-1}} · 1
  = [M!/(k1! (M - k1)!)] · [(M - k1)!/(k2! (M - (k1 + k2))!)] ... [(M - (k1 + k2 + ... + k_{N-2}))!/(k_{N-1}! kN!)]
  = M!/(k1! k2! ... kN!) = M! / Π_{i=1}^{N} ki!,
P(A) = m/n = M! / (N^M Π_{i=1}^{N} ki!).
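A short sketch of this result in Python (the helper name and the test figures are ours): it evaluates P(A) = M!/(N^M k1! ... kN!) and, as a cross-check, reproduces the answer 3/16 of problem 1.36 by summing over the 4·3 ways of labelling the channels there.

```python
from math import factorial, prod

def occupancy_prob(counts, N):
    """Probability that channel i receives exactly counts[i] telegrams when
    M = sum(counts) telegrams are distributed at random over N channels."""
    M = sum(counts)
    return factorial(M) / (N**M * prod(factorial(k) for k in counts))

print(occupancy_prob((3, 1, 0, 0), 4))          # 4!/(4**4 * 3!) = 1/64
print(4 * 3 * occupancy_prob((3, 1, 0, 0), 4))  # = 3/16, the answer to problem 1.36
```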
1.39*. Under the conditions of problem 1.37 find the probability that no telegrams will be sent over l0 of the N channels, one telegram will be sent over l1 channels, and so on, and all M telegrams will be sent over lM channels:
l0 + l1 + ... + lM = N;  0·l0 + 1·l1 + ... + M·lM = M.
Solution. The total number of cases is n = N^M. To find the number of favourable cases m, we must multiply the number of ways in which the channels can be chosen by the number of ways in which the telegrams can be chosen. The number of ways to choose the channels is
N!/(l0! l1! ... lM!) = N! / Π_{k=0}^{M} lk!.
Let us find the number of ways in which the telegrams can be chosen. They fall into a number of groups: the initial group (0 telegrams) is empty, the first group contains l1 telegrams; in general the kth group contains k·lk telegrams (k = 1, 2, ..., M). The number of ways to choose the groups of telegrams is
M! / [(1·l1)! (2·l2)! (3·l3)! ... (M·lM)!] = M! / Π_{k=1}^{M} (k·lk)!.   (1.39.1)
Let us now find the number of ways to choose the telegrams from the kth group so that k telegrams are sent over each channel. This number of ways is
(k·lk)!/(k!)^{lk},
and the number of ways to choose all the telegrams for all the groups is equal to the product of these numbers for different k:
Π_{k=1}^{M} [(k·lk)!/(k!)^{lk}].   (1.39.2)
Multiplying (1.39.1) and (1.39.2), we get the number of ways in which the telegrams can be chosen:
M! / Π_{k=1}^{M} (k!)^{lk}.
Multiplying this number by the number of ways in which the channels can be chosen, we find the number of favourable cases
m = [N! / Π_{k=0}^{M} lk!] · [M! / Π_{k=1}^{M} (k!)^{lk}].
Hence the probability of the event we are interested in
P = m/n = N! M! / [N^M Π_{k=0}^{M} lk! Π_{k=1}^{M} (k!)^{lk}].   (1.39.3)

1.40. There is a point image of an object M on a circular radar screen (Fig. 1.40) which is positioned at random in the circle, no domain within the circle being more probable than another (the image of the object is "thrown at random" onto the screen). We consider an event A consisting in the distance ρ from the point M to the centre of the screen being smaller than r/2, i.e. A = {ρ < r/2}. Find the probability of the event.
Solution. The sample space Q is the interior of the circle of radius r. The domain A is hatched in Fig. 1.40.
P(A) = S_A/S_Q = (πr²/4)/(πr²) = 1/4.

Fig. 1.40
1.41. We have a magnetic tape 200 m long with messages written on two tracks. The message on the first track is l1 = 30 m long and that on the other track is l2 = 50 m long; the location of the records is unknown. Because of a damage, we had to remove a section of the tape l0 = 10 m long which begins 80 m from the start. Find the probabilities of the following events:
A = {neither record is damaged};
B = {the first record is damaged and the second is not};
C = {the second record is damaged and the first is not};
D = {both records are damaged}.

Solution. Since the location of neither record is known, we infer that each may begin anywhere as long as the whole record lies on the tape. We designate the abscissa of the beginning of the first record as x and that of the second record as y. The sample space is a rectangle of length L - l1 = 170 m and of height L - l2 = 150 m (Fig. 1.41). In the figure the different kinds of hatching designate the domains corresponding to damages to the first and second records, and the letters A, B, C, D designate the domains corresponding to the events A, B, C, D (every domain, except for D, consists of separate parts).
S_Q = 170 × 150 = 25 500 m²,
S_A = 50 × 60 + 80 × 60 + 30 × 50 + 80 × 30 = 11 700 m²,
S_B = 40 × 60 + 40 × 30 = 3600 m²,
S_C = 50 × 60 + 80 × 60 = 7800 m²,
S_D = 60 × 40 = 2400 m².
P(A) = 0.459, P(B) = 0.141, P(C) = 0.306, P(D) = 0.094.
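These four probabilities can also be checked by simulation; the sketch below (ours, not the book's) throws the record beginnings x and y uniformly on [0, 170] and [0, 150] and tests overlap with the removed section [80, 90] m.

```python
import random

random.seed(2)
N = 1_000_000
counts = {"A": 0, "B": 0, "C": 0, "D": 0}
for _ in range(N):
    x = random.uniform(0, 170)                 # start of the 30 m record
    y = random.uniform(0, 150)                 # start of the 50 m record
    first = x < 90 and x + 30 > 80             # first record overlaps the cut [80, 90]
    second = y < 90 and y + 50 > 80            # second record overlaps the cut
    key = "D" if first and second else "B" if first else "C" if second else "A"
    counts[key] += 1
print({k: round(v / N, 3) for k, v in counts.items()})
# close to {'A': 0.459, 'B': 0.141, 'C': 0.306, 'D': 0.094}
```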

1.42. Two signals, each τ < 1/2 long, are transmitted by radio for a time interval (0, 1), each of them beginning at any moment of the interval (0, 1 - τ) with equal probability. If the signals overlap even partially, they both become distorted and cannot be received. Find the probability that the signals will be received without distortion.

Fig. 1.41

Solution. We denote the moment when the first signal begins by x and the moment when the second signal begins by y. The sample space is shown in Fig. 1.42. The hatched domains correspond to the event
A = {the signals are not distorted} = {|x - y| > τ};
P(A) = S_A/S_Q = (1 - 2τ)²/(1 - τ)².

Fig. 1.42

1.43. There are two parallel telephone lines of length l (Fig. 1.43a), d < l apart. It is known that there is a break in each of them (the location of each break is unknown). Find the probability that the distance R between the breaks is not larger than a (d < a < √(l² + d²)).
Solution. We denote the abscissa of the first break by x and that of the second by y; R = √(|x - y|² + d²). The event we speak of is A = {|x - y|² + d² ≤ a²} = {|x - y| ≤ √(a² - d²)}. The sample space is a square with side l, S_Q = l². The domain A is hatched in Fig. 1.43b.
S_A = 2l√(a² - d²) - a² + d²,
P(A) = (2/l)√(a² - d²) - (a² - d²)/l².
1.44. A bar of unit length is broken into three parts x, y, z (Fig. 1.44a). Find the probability that a triangle can be formed from the resulting parts.

Fig. 1.43
Solution. The elementary event ω is characterized by two parameters, x and y [since z = 1 - (x + y)]. We depict the event by a point on the x, y plane (Fig. 1.44b). The conditions x > 0, y > 0, x + y < 1 are imposed on the quantities x and y; the sample space is the interior of a right triangle with unit legs, i.e. S_Q = 1/2.

Fig. 1.44

The condition A requiring that a triangle could be formed from the segments x, y, 1 - (x + y) reduces to the following two conditions: (1) the sum of any two sides is larger than the third side; (2) the difference between any two sides is smaller than the third side. This condition is associated with the triangular domain A (Fig. 1.44c) with area S_A = (1/2)(1/4) = 1/8; P(A) = S_A/S_Q = 1/4.
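A Monte Carlo illustration of this answer (a sketch of ours, not part of the book): a point (x, y) is thrown uniformly into the triangle x > 0, y > 0, x + y < 1, and the three pieces x, y, 1 - x - y form a triangle exactly when each is shorter than 1/2.

```python
import random

random.seed(3)
N = 1_000_000
favourable = trials = 0
while trials < N:
    x, y = random.random(), random.random()
    if x + y >= 1:                        # point outside the sample space, reject it
        continue
    trials += 1
    z = 1 - x - y
    if x < 0.5 and y < 0.5 and z < 0.5:   # triangle inequality for the three pieces
        favourable += 1
print(favourable / N)                     # close to 1/4
```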
1.45. Buffon's needle problem. A plane is ruled with parallel straight lines a distance L from each other (Fig. 1.45a). A needle (a line segment) of length l < L is thrown at random on the plane. Find the probability that it will hit one of the lines.
Solution. We characterize the outcome of the experiment (the position of the needle) by two numbers: the abscissa x of the centre of the needle with respect to the nearest line on the left and by the angle φ the needle makes with the direction of the lines (Fig. 1.45b). The fact that the needle is thrown on the plane at random means that all the values of x and φ are equipossible. We can evidently limit (without losing generality) the possible values of x to the interval from 0 to L/2 and those of φ to the interval from 0 to π/2, and consider the possibility of the needle hitting a line for only one of the lines (the nearest left line).

Fig. 1.45

The sample space Q is a rectangle with sides L/2 and π/2 (Fig. 1.45c), S_Q = Lπ/4. The needle will hit the line if the abscissa x of its centre is smaller than (l/2) sin φ; the event we are interested in is A = {x < (l/2) sin φ}. The domain A is hatched in Fig. 1.45c; its area is
S_A = ∫₀^{π/2} (l/2) sin φ dφ = l/2,
P(A) = S_A/S_Q = 2l/(πL).
Remark. This formula was obtained by Buffon in the 18th century and was repeatedly verified experimentally, for which purpose the frequency of hits was calculated for a long series of throws. The formula was even used to make an approximate calculation of the number π, and satisfactory results were obtained.
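In the same spirit as the remark, here is a small simulation sketch (ours, and only an illustration, since sampling the angle already presupposes a value of π): with L = 2 and l = 1 the hit probability is 1/π, so N/hits estimates π.

```python
import math
import random

random.seed(4)
L, l, N = 2.0, 1.0, 1_000_000
hits = 0
for _ in range(N):
    x = random.uniform(0, L / 2)             # distance of the centre to the nearest line
    phi = random.uniform(0, math.pi / 2)     # angle between the needle and the lines
    if x < (l / 2) * math.sin(phi):          # the needle crosses the line
        hits += 1
print(N / hits)                              # estimate of pi, roughly 3.14
```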
CHAPTER 2

Algebra of Events. Rules for Adding


and Multiplying Probabilities

2.0. The theory of probability is based on indirect rather than direct methods of calculating probabilities when the probability of an event is expressed in terms of the probabilities of other related events. It is necessary, first of all, to know how to express an event in terms of other events using the so-called algebra of events. We now introduce the concepts of "a sum of events" and "a product of events" and operations on them. Note that these concepts are only introduced for events which are subsets of the same space of elementary events Q.
Since in our set-theoretical representation events are sets, the actions performed on them (addition and multiplication) are defined as those corresponding to set operations (see Chapter 1).
Nevertheless, we shall repeat certain definitions and give them a geometric interpretation. We shall depict the space of elementary events as a rectangle and the

Fig. 2.0.1    Fig. 2.0.2

events as parts of the rectangle (Fig. 2.0.1). The sum A + B of two events A and B is an event consisting in the occurrence of at least one of those events (the hatched domain in Fig. 2.0.1). The sum of two events is evidently a union (sum) of the respective sets. Similarly, the sum of several events A1, A2, ..., An is the event Σ_{i=1}^{n} Ai = A1 + A2 + ... + An consisting in an occurrence of at least one of them. We can also form the union or sum of an infinite (countable) number of events: Σ_{i=1}^{∞} Ai = A1 + A2 + ... + An + ...
The product A·B of two events A and B is the event consisting in a simultaneous realization of the two events (the intersection of the sets, Fig. 2.0.2). The product of several events A1, A2, ..., An is an event A1A2 ... An = Π_{i=1}^{n} Ai consisting in a simultaneous realization of all the events (an intersection of the respective sets). We can also multiply an infinite (countable) number of events: Π_{i=1}^{∞} Ai = A1A2 ... An ...

It follows from the definitions of a sum and a product of events that
A + A = A,  A·A = A,
A + Q = Q,  A·Q = A,
A + ∅ = A,  A·∅ = ∅.
If A ⊆ B, then A + B = B, A·B = A.
The operations of addition and multiplication of events have some of the properties of ordinary addition and multiplication.
1. Commutativity:
A + B = B + A,  A·B = B·A.
2. Associativity:
(A + B) + C = A + (B + C),  (AB)C = A(BC).
3. Distributivity:
A(B + C) = AB + AC.
All these properties follow from the fact that events are sets (see Chapter 1).

Fig. 2.0.4    Fig. 2.0.5

An opposite, or complementary, event of A is the event Ā of nonoccurrence of the event A: Ā = {A has not occurred}. A complementary event is represented in Fig. 2.0.3; the domain Ā is the complement of A with respect to the complete space Q.
It follows from the definition of a complementary event that
A + Ā = Q,  A·Ā = ∅.
It is easy to verify (Fig. 2.0.4) that if B ⊆ A, then Ā ⊆ B̄. It is as easy to verify (Fig. 2.0.5) the following properties of complementary events: the event complementary to A + B is Ā·B̄, and the event complementary to A·B is Ā + B̄.
The rules of the event algebra make it possible to combine various simple events and form in that way other more complex events.
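Since events here are simply subsets of Q, the operations above can be mirrored directly with Python sets; the following small sketch (ours) uses the die experiment of Chapter 1 and checks the two complementation rules just stated.

```python
Q = {1, 2, 3, 4, 5, 6}            # sample space for one throw of a die
A = {2, 4, 6}                     # "the score is even"
B = {5, 6}                        # "the score is no less than 5"

print(A | B)                      # the sum A + B      -> {2, 4, 5, 6}
print(A & B)                      # the product A*B    -> {6}
not_A, not_B = Q - A, Q - B       # complementary events
print(Q - (A | B) == not_A & not_B)   # complement of A + B equals (not A)(not B): True
print(Q - (A & B) == not_A | not_B)   # complement of A*B equals (not A) + (not B): True
```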
The probabilities of compound events can be calculated from the probabilities of simpler events, using two basic rules (addition and multiplication) of probability theory. These rules are often called the fundamental theorems of probability theory, but in reality they are theorems only for the urn model and are introduced axiomatically for experiments which do not reduce to the urn model.
1. The addition rule for probabilities. The probability of the sum of two mutually exclusive events is equal to the sum of the probabilities of those events, i.e. if A·B = ∅, then
P(A + B) = P(A) + P(B).   (2.0.1)
This rule was introduced axiomatically in Chapter 1 (axiom II).

The addition rule of probabilities can be easily generalized to an arbitrary number n of incompatible events: if Ai·Aj = ∅ for i ≠ j, then
P(Σ_{i=1}^{n} Ai) = Σ_{i=1}^{n} P(Ai).   (2.0.2)
The addition rule can be as easily generalized to the case of an infinite (countable) number of events: if Ai·Aj = ∅ for i ≠ j, then
P(Σ_{i=1}^{∞} Ai) = Σ_{i=1}^{∞} P(Ai)   (2.0.3)
(see axiom III in Chapter 1).
It follows from the addition rule that if A1, A2, ..., An are disjoint events that form a complete group, then the sum of their probabilities is equal to unity, i.e. if
Σ_{i=1}^{n} Ai = Q,  Ai·Aj = ∅ for i ≠ j,
then
Σ_{i=1}^{n} P(Ai) = 1.   (2.0.4)
In particular, since two opposite events A and Ā are mutually exclusive and form a complete group, the sum of their probabilities is equal to unity:
P(A) + P(Ā) = 1.   (2.0.5)
To formulate the multiplication rule for probabilities, it is necessary to introduce the concept of conditional probability. The conditional probability of an event A relative to the hypothesis that an event B occurs [designated as P(A | B)] is the probability of the event A calculated on the hypothesis that the event B has occurred.
Advancing a hypothesis that the event B has occurred is equivalent to changing the conditions of the experiment, i.e. retaining only those elementary events which are favourable to the event B and removing all the other events. It follows that instead of the space of elementary events Q we have a new space Q_B corresponding to the event B (Fig. 2.0.6). The domain AB, which corresponds to the intersection of A and B, is favourable to the event A on the hypothesis that the event B is realized.

Fig. 2.0.6
2. The multiplication rule for probabilities. The probability of the product of two events A and B is equal to the probability of one of them (say, A) multiplied by the conditional probability of the other, provided that the first event has occurred:
P(AB) = P(A)·P(B | A),   (2.0.6)
or, if we take B as the first event,
P(AB) = P(B)·P(A | B).   (2.0.7)
The multiplication rule for probabilities can be generalized to an arbitrary number of events:
P(A1A2 ... An) = P(A1)·P(A2 | A1)·P(A3 | A1A2) ... P(An | A1A2 ... A_{n-1}).   (2.0.8)

Formula (2.0.6) yields the following expression for a conditional probability:
P (B | A) = P (AB)/P (A),   (2.0.9)
i.e. the conditional probability of one event, on the hypothesis that the other event occurs,
is equal to the probability of the product of the two events divided by the probability
of the event which is assumed to be realized.
By analogy with formula (2.0.9) we can write
P (A | B) = P (AB)/P (B).   (2.0.10)
Two events A and B are said to be independent if the occurrence of one of them
does not affect the probability of the occurrence of the other:
P (A | B) = P (A),   (2.0.11)
or, which is the same thing,
P (B | A) = P (B).   (2.0.12)
For two independent events the multiplication rule for probabilities assumes
the form
P (AB) = P (A) P (B),   (2.0.13)
i.e. the probability of the product of two independent events is equal to the product of
the probabilities of those events.
Several events are said to be independent in their totality (or simply independent)
if the occurrence of any number of them does not affect the probabilities of the
other events. For several independent events the multiplication rule (2.0.8) assumes
the form
P (A1 A2 ... An) = P (A1) P (A2) ... P (An),   (2.0.14)
i.e. the probability of the product of independent events is equal to the product of their
probabilities.
Several trials are said to be independent if the probability of the outcome of
each of them does not depend on the outcomes of the other trials.
The following theorem on the repetition of trials is a corollary of the addition
and multiplication rules for probabilities: if n independent trials are performed, in
each of which an event A occurs with probability p, then the probability of the event A
occurring exactly m times in the given experiment is expressed by the formula
P_{m,n} = C_n^m p^m (1 - p)^{n-m},   (2.0.15)
or, designating 1 - p = q,
P_{m,n} = C_n^m p^m q^{n-m}.   (2.0.16)
The probability of the event A occurring not less than m times in a series of
n independent trials is expressed by the formula
R_{m,n} = Σ_{k=m}^{n} C_n^k p^k q^{n-k},   (2.0.17)
or by the formula
R_{m,n} = 1 - Σ_{k=0}^{m-1} C_n^k p^k q^{n-k}.   (2.0.18)
From the two formulas (2.0.17) and (2.0.18) we choose the one which contains
the smaller number of terms.
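For readers who wish to evaluate formulas (2.0.15)-(2.0.18) numerically, the following Python sketch (ours, not part of the original text; the function names are illustrative) computes P_{m,n} and R_{m,n} and chooses between (2.0.17) and (2.0.18) exactly as recommended above.

    from math import comb

    def p_exactly(m, n, p):
        """Formula (2.0.15): exactly m occurrences in n independent trials."""
        return comb(n, m) * p**m * (1 - p)**(n - m)

    def r_at_least(m, n, p):
        """Formulas (2.0.17)/(2.0.18): at least m occurrences in n trials."""
        if m <= n - m:                      # (2.0.18) has fewer terms here
            return 1 - sum(p_exactly(k, n, p) for k in range(m))
        return sum(p_exactly(k, n, p) for k in range(m, n + 1))

    # Example with n = 6 trials and p = 0.2 (cf. Problem 2.77)
    print(p_exactly(2, 6, 0.2))    # ≈ 0.2458
    print(r_at_least(2, 6, 0.2))   # ≈ 0.3446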

Problems and Exercises


2.1. Can the sum of two events A and B coincide with their product?
Answer. It can if the event A is equivalent to the event B (A ⊆ B and
B ⊆ A). For example, if a message transmitted over a communication
channel can only be distorted by a noise on the time interval occupied
by the message and is surely distorted when there is a noise, then the
events
A = {the message is distorted},
B = {there is a noise on the time interval occupied by the message}
are equivalent: A = B; A + B =A = B; AB =A =B.

Fig. 2.2 Fig. 2.3

2.2. Prove that if two events A and B are compatible, then


P (A +B) = P (A) + P (B)- P (AB). (2.2.1)
Solution. Let us represent the event A + B as a sum of three incom-
patible events: AB̅ (A and not B); A̅B (B and not A) and AB (both A
and B) (see Fig. 2.2):
A + B = AB̅ + A̅B + AB.
We find the expressions of the events A and B:
A = AB̅ + AB;  B = A̅B + AB.
Using the addition rule for probabilities, we find:
P (A + B) = P (AB̅) + P (A̅B) + P (AB),   (2.2.2)
P (A) = P (AB̅) + P (AB),
P (B) = P (A̅B) + P (AB).
Adding the two last expressions together, we get
P (A) + P (B) = P (AB̅) + P (A̅B) + 2P (AB).
Subtracting (2.2.2) from this equation, we get
P (A) + P (B) = P (A + B) + P (AB),

whence follows (2.2.1). From (2.2.1) it follows that
P (A + B) ≤ P (A) + P (B)
always, the equality sign occurring only for incompatible events.
2.3. Write the expression for the probability of the sum of three joint
events.
Solution. From Fig. 2.3 we obtain
P (A + B + C) = P (A) + P (B) + P (C) - P (AB) - P (AC)
- P (BC) + P (ABC).
2.4. An experiment consists in tossing two coins. The following events
are considered:
A = {the first coin comes up heads}; B = {the first coin comes up
tails}; C = {the second coin comes up heads};
D = {the second coin comes up tails}; E = {at least one head};
F = {at least one tail}; G = {one head and one tail};
H = {no heads}; K = {two heads}.
Determine the events to which the following events are equivalent:
(1) A + C; (2) AC; (3) EF; (4) G + E; (5) GE; (6) BD; (7) E + K.
Answer. (1) A + C = E; (2) AC = K; (3) EF = G; (4) G + E = E;
(5) GE = G; (6) BD = H; (7) E + K = E.

2.5. Three messages are sent in succession over a communication
channel, each of which may either be sent correctly or be distorted.
We consider the following events:
A_i = {the ith message was sent correctly};
A̅_i = {the ith message was distorted} (i = 1, 2, 3).
Express the following events as sums, products and sums of products
of the events A_i and A̅_i:
A = {the three messages were sent correctly};
B = {the three messages were distorted};
C = {at least one message was sent correctly};
D = {at least one message was distorted};
E = {not less than two messages were sent correctly};
F = {not more than one message was sent correctly};
G = {the first message sent correctly was the third in the
sequence}.

Answer.
A = A1A2A3,  B = A̅1A̅2A̅3,
C = A1A2A3 + A̅1A2A3 + A1A̅2A3 + A1A2A̅3 + A̅1A̅2A3 + A̅1A2A̅3 + A1A̅2A̅3,
D = A̅1A2A3 + A1A̅2A3 + A1A2A̅3 + A̅1A̅2A3 + A̅1A2A̅3 + A1A̅2A̅3 + A̅1A̅2A̅3,
E = A1A2A3 + A̅1A2A3 + A1A̅2A3 + A1A2A̅3,
F = A̅1A̅2A̅3 + A1A̅2A̅3 + A̅1A2A̅3 + A̅1A̅2A3,
G = A̅1A̅2A3.
2.6. A group of four homogeneous objects is being tracked. Each of
the objects can be either detected or not detected. We consider the
following events:
A = {exactly one of the four objects is detected};
B = {at least one object is detected};
C = {not less than two objects are detected};
D = {exactly two objects are detected};
E = {exactly three objects are detected};
F = {all four objects are detected}.
Find what the following events consist in: (1) A + B; (2) AB; (3)
B + C; (4) BC; (5) D + E + F; (6) BF. Are the events BF and CF
the same? Are the events BC and D the same?
Answer. (1) A + B = B; (2) AB = A; (3) B + C = B; (4) BC = C;
(5) D + E + F = C; (6) BF = F. The events BF and CF are the
same; BC and D are not the same.
2.7. Given below are experim ents and the events that can occur in
them. Indicat e their comple mentar y events.
(1) Two messages are sent over a commu nicatio n channe l: an event
A = {both messages are sent correct ly}.
(2) A ball is drawn from an urn which contain s two white, three black
and four red balls; an event B = {a white ball is drawn} .
(3) Five messages are sent; an event C = {not less than three mes-
sages are sent correct ly}.
(4) n shots are fired at a target; an event D = {at least one hit is
registered}.
(5) A preven tive inspect ion of an instrum ent is made, the instru-
ment consist ing of k units, each of which can be either put in order at
once or sent for repair. An event E = {none of the units is sent for
repair} .
(6) Two people play chess; an event F = {the whites win}.
Answer. (1) A̅ = {at least one message is distorted}; (2) B̅ = {a black
or a red ball is drawn}; (3) C̅ = {not more than two messages are sent

correctly}; (4) D̅ = {no hits are registered}; (5) E̅ = {at least one unit
must be repaired}; (6) F̅ = {the blacks win or the game is drawn}.
2.8. An event B is a special case of an event A: B ⊆ A, i.e. it fol-
lows from the occurrence of the event B that the event A has occurred.
Does it follow from B̅ that A̅ has occurred?
Answer. No, it does not. For instance, an experiment may consist
in sending two messages, an event A = {at least one message is dis-
torted} and an event B = {both messages are distorted}. If an event B̅
has occurred, B̅ = {less than two messages are distorted}, it does not
follow that neither of the messages is distorted (event A̅). On the con-
trary, B̅ follows from A̅ (A̅ ⊆ B̅).
Figure 2.8 shows events A and B, B ⊆ A, and the complementary
events A̅ (vertical hatching) and B̅ (horizontal hatching). It can be im-
mediately seen that A̅ ⊆ B̅.
Fig. 2.8

2.9. If an event B is a special case of an
event A (B ⊆ A), are the events dependent?
Answer. They are dependent if P (A) ≠ 1,
since P (A | B) = 1.
2.10. Are the following events dependent:
(1) mutually disjoint events; (2) events which
form a complete group; (3) equipossible
events?
Answer. (1) Dependent, since the occur-
rence of any of them turns into zero the
probabilities of all the others happening;
(2) dependent, since the nonoccurrence of all but one turns into unity
the probability of the last one; (3) may be either dependent or inde-
pendent.
2.11. An experiment consists in a successive tossing of two coins.
We consider the following events:
A = {the first coin comes up heads}; D = {at least one head comes
up}; E = {at least one tail comes up}; F = {the second coin comes up
heads}. Determine whether the following pairs of events are depen-
dent: (1) A and E; (2) A and F; (3) D and E; (4) D and F. Determine the
conditional and unconditional probabilities of the events in each pair.
Answer.
(1) P (E) = 3/4, P (E | A) = 1/2; the events are dependent.
(2) P (A) = 1/2, P (A | F) = 1/2; the events are independent.
(3) P (D) = 3/4, P (D | E) = 2/3; the events are dependent.
(4) P (D) = 3/4, P (D | F) = 1; the events are dependent.
2.12. An urn contains a white and b black balls. Two balls are drawn,
either simultaneously or one after another. Find the probability that
both balls are white.*)
*) This problem, like a number of other problems in this chapter, can also be
solved by a direct calculation of the number of cases. The point here is to solve
them by using the addition and multiplication rules.

Answer. By the multiplication rule for probabilities
P {both balls are white} = P {ww} = a/(a+b) · (a-1)/(a+b-1).

2.13. There are a white and b black balls in an urn. A ball is drawn,
its colour is noted and the ball is replaced. Then one more ball is drawn.
Find the probability that both drawn balls are white.
Answer. [a/(a + b)]^2.
2.14. There are a white and b black balls in an urn. Two balls are
drawn at once. Find the probability that the balls are different in colour.
Solution. The event can occur in two incompatible variants: {wb} or
{bw}; by the addition and multiplication rules
P {wb + bw} = a/(a+b) · b/(a+b-1) + b/(a+b) · a/(a+b-1)
= 2ab / [(a+b)(a+b-1)].

2.15. The problem is the same as 2.14, but the balls are drawn one
after another and the first ball is replaced.
Answer. 2ab/(a + b)^2.
2.16. There are a white and b black balls in an urn. All the balls are
drawn from the urn at random, one after another. Find the probability
that a white ball will be the second in the sequence.
Solution. We can find the probability of the event directly (see
problem 1.12). The same result can be obtained by the addition and
multiplication rules, i.e.
P {ww + bw} = a/(a+b) · (a-1)/(a+b-1) + b/(a+b) · a/(a+b-1) = a/(a+b).

2.17. There are a white, b black and c red balls in an urn. Three of
them are drawn at random. Find the probability that at least two of
them are of the same colour.
Solution. To find the probability of an event consisting in at least two
balls being the same colour, we pass to the complementary event A̅ =
{all the balls are different in colour}:
P (A̅) = P {wbr + wrb + rbw + ...}   (6 combinations).
Hence
P (A) = 1 - P (A̅) = 1 - 6abc / [(a+b+c)(a+b+c-1)(a+b+c-2)].

2.18. A coin is tossed until two heads or two tails appear in succes-
sion (see problem 1.5). Find the probability of the event A = {not
more than three tosses are needed}.

2.25. In a batch of N articles, M articles are faulty. We take n articles
from the batch for inspection. If more than m faulty articles are found
among those inspected, we reject the whole batch. Find the probability
that the batch will be rejected.
Solution. The event A = {the batch is rejected} can be represented
as a sum
A = A_{m+1} + A_{m+2} + ...,
where A_i = {there are i faulty articles among those being inspected}.

2.26. A box contains homogeneous articles of different qualities:
a articles are of the best quality, b articles are of the first quality and c
articles are of the second quality (a ≥ 4, b ≥ 4, c ≥ 4). Four articles
are taken simultaneously and at random from the box without regard
for their quality. The following events are considered:
A = {at least one of the chosen articles is of the best quality};
B = {at least one of the chosen articles is of the second quality}.
Find the probability of the event C = A + B.
Solution. Passing to the complementary event C̅ = {there is neither
A nor B} = {all the articles are of the first quality}, we have
P (C̅) = b/(a+b+c) · (b-1)/(a+b+c-1) · (b-2)/(a+b+c-2) · (b-3)/(a+b+c-3),
whence it follows that P (C) = 1 - P (C̅).
2.27. During one surveillance cycle of a radar unit tracking a target,
the target is detected with probability p. The detection of the target
in each cycle is independent of the other cycles. Find the probability that
the target will be detected in n cycles.
Answer. 1 - (1 - p)^n.
2.28. There are m radar units, each of which detects a target during
one surveillance cycle with probability p (independently of the other cycles
and the other units). During time T each unit makes n cycles. Find the
probabilities of the following events:
A = {the target is detected by at least one unit};
B = {the target is detected by all the units}.
Answer.
P (A) = 1 - (1 - p)^{mn},  P (B) = [1 - (1 - p)^n]^m.
2.29. There is a group of k targets, each of which, independently of
the other targets, can be detected by a radar unit with probability p.

Each of m radar units tracks the targets independently of the other units.
Find the probability that not all the targets in the group will be de-
tected.
Solution. We pass to the complementary event A̅ = {all the targets
will be detected}:
P (A̅) = [1 - (1 - p)^m]^k;  P (A) = 1 - [1 - (1 - p)^m]^k.
2.30. A total of k workers take part in succession in the manufacture
of an article; when an article is transferred to the next worker, its
quality is not inspected. The first worker may spoil the article with
probability p1, the second worker with probability p2, and so on. Find
the probability that a faulty article will be manufactured.
Answer.
1 - Π_{i=1}^{k} (1 - p_i).

2.31. In a certain lottery n tickets were sold, l of which were winning


tickets. A certain Petrov buys k tickets. Find the probability that he
will receive at least one prize.
Answer.
1 - (n-l)/n · (n-l-1)/(n-1) ··· (n-l-k+1)/(n-k+1) = 1 - (n-l)! (n-k)! / [n! (n-l-k)!].

2.32. Two balls are distributed at random and independently of one


another among four cells which lie in a straight line. Each ball can
fall in each cell with the same probability of 1/4. Find the probability
that the balls will fall in adjacent cells.
Solution. We divide the event A = {the balls fall in adjacent cells}
into a sum of as many variants as there are pairs of adjacent cells we
can form; we get A = A 1 + A 2 + A 3 , where
A1 ={the balls fall in the first and the second cell};
A 2 = {the balls fall in the second and the third cell};
A 3 = {the balls fall in the third and the fourth cell}.
The probability of each variant is the same and equal to
(1/4)·(1/4)·2 = 1/8,  so that P (A) = 3/8.
2.33. k balls are distributed at random and independently of one
anothe1: ~mong n cells '~hich lie in a straight line (k < n). Find the
probabthty that they w1ll occupy k adjacent cells.
Solution. We can choose k adjacent cells out of n in n - k + 1 ways.
The probability that the k balls will fall in a given group of k adjacent
cells is equal to (1/n)^k · k! (since they can be distributed among those
cells in k! ways). The probability of the event A = {the balls fall in
k adjacent cells} is P (A) = (1/n)^k · k! · (n - k + 1).
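A quick way to sanity-check this result is to simulate the random placement directly; the following Python sketch (ours, not from the book) compares the formula with a Monte Carlo estimate for small n and k.

    import random
    from math import factorial

    def adjacent_prob_exact(n, k):
        # the formula derived above: (n - k + 1) * k! / n**k
        return (n - k + 1) * factorial(k) / n**k

    def adjacent_prob_mc(n, k, trials=200_000):
        hits = 0
        for _ in range(trials):
            cells = [random.randrange(n) for _ in range(k)]   # each ball lands in a uniform cell
            occupied = set(cells)
            # the k balls occupy k adjacent cells iff the cells are all distinct
            # and form a contiguous block of length k
            if len(occupied) == k and max(occupied) - min(occupied) == k - 1:
                hits += 1
        return hits / trials

    print(adjacent_prob_exact(4, 2))   # 0.375, as in Problem 2.32
    print(adjacent_prob_mc(4, 2))      # should be close to 0.375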

2.34. One day a post office received 20 telegrams intended for four
different addresses (five for each address). Four telegrams are chosen
at random. Find the probabilities of the following events:
A = {all the telegrams are intended for different addresses};
B = {all the telegrams are intended for the same address}.
Solution. For the event A to occur, the first telegram may be intend-
ed for any address, the address of the second telegram must differ
from that of the first one, that of the third must differ from the addres-
ses of the first two and that of the fourth must differ from the addresses
of the first three. By the multiplication rule for probabilities we have
P (A) = 1 · (15/19) · (10/18) · (5/17) ≈ 0.129,
P (B) = 1 · (4/19) · (3/18) · (2/17) ≈ 0.00413.

2.35. A computer consists of n units. The reliability (failure-free
performance) of the first unit for the time T is p1, that of the second unit
is p2, and so on. The units may fail independently of one another. When
any unit fails, the computer fails. Find the probability that the com-
puter will fail during the time T.
Answer.
1 - Π_{i=1}^{n} p_i.
2.36. When the ignition is turned on, the engine picks up with pro-
bability p. Find (1) the probability that the engine picks up when the
ignition is turned on for the second time; (2) the probability that it is
necessary to switch on the ignition no more than two times for the
engine to begin working.
Answer. (1) (1 - p) p; (2) 1 - (1 - p)^2 = (2 - p) p.
2.37. Three messages are sent over a communication channel, each
of which may be transmitted with different accuracy. The transmission
of one message can lead to one of the following events:
A1 = {the message is transmitted in a correct form};
A2 = {the message is partially distorted};
A3 = {the message is completely distorted}.
The probabilities of the events A1, A2 and A3 are known to equal p1,
p2 and p3 (p1 + p2 + p3 = 1). Considering that messages may be dis-
torted or transmitted correctly independently of one another, find the
probabilities of the following events:
A = {all three messages are transmitted in a correct form};
B = {at least one of the messages is completely distorted};
C = {no less than two messages are completely or partially distorted}.

Answer.
P (A) = p1^3,  P (B) = 1 - (p1 + p2)^3,
P (C) = 3 (p2 + p3)^2 p1 + (p2 + p3)^3.
2.38. Two radar units are scanning a region of space in which a target
is moving for a time τ. During that time the first unit makes 2n1 sur-
veillance cycles and the second unit 2n2 cycles. During one surveillance
cycle the first unit detects the target (independently of the other units)
with probability p1, the second unit with probability p2. Find the
probabilities of the following events:
A = {the target is detected in time τ by at least one of the
units};
B = {the target is detected by the first unit and is not detected
by the second unit};
C = {the target is not detected during the first half of the time τ
and is detected during the second half of the time}.
Solution. P (A) = 1 - P (A̅),
A̅ = {the target is not detected by either unit};
P (A̅) = (1 - p1)^{2n1} (1 - p2)^{2n2},  P (A) = 1 - (1 - p1)^{2n1} (1 - p2)^{2n2},
P (B) = [1 - (1 - p1)^{2n1}] (1 - p2)^{2n2},
P (C) = (1 - p1)^{n1} (1 - p2)^{n2} [1 - (1 - p1)^{n1} (1 - p2)^{n2}].
2.39. During one surveillance cycle a radar installation can detect a
target with probability p. How many surveillance cycles are needed for
the target to be detected with a probability no less than 𝒫?
Solution. We denote the unknown number of cycles by N. The con-
dition 1 - (1 - p)^N ≥ 𝒫 must be fulfilled, whence it follows that
(1 - p)^N ≤ 1 - 𝒫. Taking logarithms, we have
N log (1 - p) ≤ log (1 - 𝒫),
N ≥ log (1 - 𝒫)/log (1 - p).   (2.39)
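Formula (2.39) is easy to evaluate numerically. The sketch below is ours (not from the text); the name of the target probability, written here as target, is illustrative.

    from math import log, ceil

    def cycles_needed(p, target):
        """Smallest N with 1 - (1 - p)**N >= target, cf. formula (2.39)."""
        assert 0 < p < 1 and 0 < target < 1
        return ceil(log(1 - target) / log(1 - p))

    print(cycles_needed(0.2, 0.9))   # 11, since 1 - 0.8**11 ≈ 0.914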

2.40. A message being sent over a communication channel consists
of n signs (symbols). During the transmission each sign is distorted
(independently of the other signs) with probability p. For the sake of
reliability, each message is repeated k times. Find the probability that
at least one of the messages being transmitted will not be distorted in
any sign.
Solution. The probability that one separate message will not be
distorted is equal to (1 - p)^n; the probability that at least one out of
k messages will not be distorted is
P (A) = 1 - [1 - (1 - p)^n]^k.

P (A) = 1 - [1 - (1 - p)n]k.

2.41. Under the conditions of Problem 2.40, how many times must
a message be repeated for the probability that at least one message
will not be distorted to be no less than 𝒫?
Answer. By formula (2.39) we have N ≥ log (1 - 𝒫)/log [1 -
(1 - p)^n].
2.42. A significant message is sent simultaneously over n communi-
cation channels and is repeated k times over each channel for the sake
of reliability. During one transmission a message (independently of
the other messages) is distorted with probability p. Each communica-
tion channel (independently of the other channels) is blocked up
with noise with probability Q; a blocked-up channel cannot transmit
any messages. Find the probability of the event
A = {the message is transmitted in a correct form at least once}.
Solution. We introduce the event
B = {the message is transmitted over one communication channel with-
out distortion at least once}.
For the event B to occur, first, the channel must not be blocked up
with noise and, second, at least one message sent over it must not be
distorted:
P (B) = (1 - Q) (1 - p^k).
The probability of the event A, which is that the event B occurs at
least over one channel, is
P (A) = 1 - [1 - P (B)]^n = 1 - [1 - (1 - Q) (1 - p^k)]^n.
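For concrete numbers this answer is easy to evaluate; the helper below is a sketch of ours, not from the text, and the sample values of n, k, p, Q are arbitrary illustrations.

    def p_correct_at_least_once(n, k, p, Q):
        """P(A) = 1 - [1 - (1 - Q)(1 - p**k)]**n for n channels, k repetitions,
        distortion probability p and blocking probability Q."""
        p_channel = (1 - Q) * (1 - p**k)     # P(B) for a single channel
        return 1 - (1 - p_channel)**n

    print(p_correct_at_least_once(n=2, k=3, p=0.1, Q=0.2))   # ≈ 0.960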
2.43. An air battle between two aircraft, a fighter and a bomber, is
going on. The fighter is the first to fire: it fires once at the bomber and
brings it down with probability p1. If the bomber is not brought down,
it fires once at the fighter and brings it down with probability p2. If the
fighter is not brought down, it fires at the bomber again and brings it
down with probability p3. Find the probabilities of the following out-
comes:
A = {the bomber is brought down};
B = {the fighter is brought down};
C = {at least one of the aircraft is brought down}.
Answer.
P (A) = p1 + (1 - p1) (1 - p2) p3,  P (B) = (1 - p1) p2,
P (C) = P (A) + P (B).

2.44. An air battle between a bomber and two attacking fighters is
going on. The bomber is the first to fire: it fires once at each fighter and
brings it down with probability p1. If a fighter is not brought down, it
fires at the bomber, irrespective of the destiny of the other fighter, and

brings it down with probabilit y p 2 • Find the probabilit ies of the fol-
lowing outcomes:
A = {the bomber is brought down};
B = {both fighters are brought down};
C = {at least one fighter is brought down};
D = {at least one aircraft is brought down};
E = {exactly one fighter is brought down};
F = {exactly one aircraft is brought down}.
Solution. The probability that a given fighter brings down the
bomber is (1 - p1) p2; the probability that neither of them brings
down the bomber is [1 - (1 - p1) p2]^2, whence
P (A) = 1 - [1 - (1 - p1) p2]^2,  P (B) = p1^2,
P (C) = 1 - (1 - p1)^2,  P (D) = 1 - (1 - p1)^2 (1 - p2)^2,
P (E) = 2p1 (1 - p1).
The event F can be represented in the form F = F1 + F2 + F3,
where
F1 = {the bomber is brought down and both fighters are safe};
F2 = {the first fighter is brought down and the second fighter and the
bomber are safe};
F3 = {the second fighter is brought down and the first fighter and the
bomber are safe}.
P (F1) = (1 - p1)^2 [1 - (1 - p2)^2],
P (F2) = P (F3) = p1 (1 - p1) (1 - p2),
P (F) = (1 - p1)^2 [1 - (1 - p2)^2] + 2p1 (1 - p1) (1 - p2).
2.45. The conditions and the questions are the same as in the preced-
ing problem, with the only difference that the fighters attack only
in pairs: if one of them is brought down, then the second one disen-
gages.
Answer.
P (A) = (1 - p1)^2 [1 - (1 - p2)^2],  P (B) = p1^2,
P (C) = 1 - (1 - p1)^2,  P (D) = 1 - (1 - p1)^2 (1 - p2)^2,
P (E) = 2p1 (1 - p1),
P (F) = (1 - p1)^2 [1 - (1 - p2)^2] + 2p1 (1 - p1).
2.46. There are a white and b black balls in an urn. Two players
draw one ball each in turn, replacing it each time and stirring the balls.
The ~l~yer who is the first to draw a white ball wins the game. Find the
probability 𝒫1 of the first player (the one who begins the game) being
the winner.

Solution. The first player can win either on the first or on the third
drawing (in the latter case black balls must be drawn the first two times
and a white ball the third time), and so on:
𝒫1 = a/(a+b) + (b/(a+b))^2 · a/(a+b) + ... + (b/(a+b))^{2k} · a/(a+b) + ...
(It is evident that 𝒫1 > 1/2 for any a and b.)
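The series for 𝒫1 is geometric with ratio (b/(a+b))^2, so a partial sum can be compared with the closed-form value it converges to; the sketch below (ours, not from the book) does this check and confirms numerically that the result exceeds 1/2.

    def p_first_player_series(a, b, terms=200):
        # partial sum of  P1 = sum_k (b/(a+b))**(2k) * a/(a+b)
        r = b / (a + b)
        return sum(r**(2 * k) * a / (a + b) for k in range(terms))

    def p_first_player_closed(a, b):
        # summing the geometric series gives (a + b) / (a + 2b)
        return (a + b) / (a + 2 * b)

    print(p_first_player_series(2, 3))   # ≈ 0.625
    print(p_first_player_closed(2, 3))   # 5/8 = 0.625, indeed greater than 1/2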


2.47. There are two white and three black balls in an urn. Two play-
ers each draw a ball in turn without replacing it. The player who
is the first to draw a white ball wins the game. Find the probability 𝒫1
that the first player wins the game.
Solution.
𝒫1 = 2/5 + (3/5)·(2/4)·(2/3) = 3/5.
2.48. An instrument is being tested. Upon each trial the instrument
fails with probability p. After the first failure the instrument is re-
paired; after the second failure it is considered to be unfit for operation.
Find the probability that the instrument is rejected exactly on the kth
trial.
Solution. For the given event to occur, it is necessary, first, that the
instrument should fail on the kth trial, the probability of that event
being p. In addition, it is necessary that in the preceding k - 1 trials
the instrument should fail exactly once, the probability of that event
being (k - 1) p (1 - p)^{k-2}. The required probability is equal to
(k - 1) p^2 (1 - p)^{k-2}.
2.49. Missiles are fired at a target. The probability of each missile
hitting the target is p; the hits are independent of one another. Each
missile which reaches the target brings it down with probability p1.
The missiles are fired until the target is brought down or the missile
reserve is exhausted; the reserve consists of n missiles (n > 2). Find
the probability that some missiles will remain in the reserve.
Solution. We pass to the complementary event A̅ = {the whole
reserve is exhausted}. For A̅ to occur, the first n - 1 missiles must not
bring the target down:
P (A̅) = (1 - pp1)^{n-1},  P (A) = 1 - (1 - pp1)^{n-1}.
2.50. Under the conditions of the preceding problem, find the proba-
bility that no less than two missiles remain in the reserve after the tar-
get is brought down.
Solution. The complementary event A̅ = {less than two missiles
remain in the reserve} is equivalent to the first n - 2 missiles not bring-
ing down the target:
P (A̅) = (1 - pp1)^{n-2},  P (A) = 1 - (1 - pp1)^{n-2}.
2.51. Under the conditions of Problem 2.49 find the probability
that no more than two missiles will be fired.
Solution. For no more than two missiles to be fired, the target must
be brought down by one of the first two shots; the probability of this
event is equal to 1 - (1 - pp1)^2.
2.52. A radar unit tracks k targets. During a surveillance period
the ith target may be lost with probability p_i (i = 1, 2, ..., k).
Find the probabilities of the following events:
A = {none of the targets is lost};
B = {no less than one target is lost};
C = {no more than one target is lost}.
Answer.
P (A) = Π_{i=1}^{k} (1 - p_i),  P (B) = 1 - Π_{i=1}^{k} (1 - p_i),
P (C) = Π_{i=1}^{k} (1 - p_i) + p1 (1 - p2) ... (1 - pk) + (1 - p1) p2 (1 - p3)
... (1 - pk) + ... + (1 - p1) (1 - p2) ... (1 - p_{k-1}) pk.
The last probability can be written in the form
P (C) = Π_{i=1}^{k} (1 - p_i) + Σ_{j=1}^{k} [p_j/(1 - p_j)] Π_{i=1}^{k} (1 - p_i).

2.53. An instrument consisting of k units is in use for a period t.
During that time, the first unit may fail with probability q1, the second
with probability q2 and so on. A repairman is summoned to check the
instrument. He checks each unit and either locates and clears the fault,
if one exists, with probability p or passes it as faultless with probability
q = 1 - p. Find the probability of the event A = {at least one unit
remains faulty after the inspection}.
Solution. The probability that the ith unit remains faulty is equal
to the probability that it became faulty during the time t multiplied by
the probability that the repairman failed to locate the fault: q_i q. The
probability that this event will occur in at least one unit is
P (A) = 1 - Π_{i=1}^{k} (1 - q_i q).
2.54. A new condition is added to those of Problem 2.53: after time t
a repairman is not found with probability Q and the instrument is used
without inspection. Find the probability of the event A = {the instru-
ment is used with at least one faulty unit}.
Answer.
m1 + m2 + m3 + m12 + m13 + m23 + m123 = m. Two people are cho-
sen from the group at random. What is the probability p that they can
use one of the three languages to talk to each other without an inter-
preter?
Solution. We enumerate all the seven groups of people and write in
front of each group the languages which all the members of the group
speak, denoting them by the letters G, A and R:
(I) G (m1 people); (II) A (m2 people); (III) R (m3 people); (IV) G, A
(m12 people); (V) G, R (m13 people); (VI) A, R (m23 people); (VII) G, A,
R (m123 people).
For two people to talk to each other, they must belong to a pair of
groups which have a language in common. It is simpler in this case to
find the probability that two people cannot talk to each other. They
must then belong to one of the following pairs: (I, II), (I, III), (I, VI),
(II, III), (II, V), (III, IV). The probabilities that one of the two se-
lected people belongs to one group and the other to another group are
P (I, II) = 2m1m2 / [m (m - 1)],  P (I, III) = 2m1m3 / [m (m - 1)],
P (I, VI) = 2m1m23 / [m (m - 1)],  P (II, III) = 2m2m3 / [m (m - 1)],
P (II, V) = 2m2m13 / [m (m - 1)],  P (III, IV) = 2m3m12 / [m (m - 1)].
Adding these probabilities and subtracting the resulting sum from
unity, we find the required probability p:
p = 1 - 2/[m (m - 1)] · (m1m2 + m1m3 + m1m23 + m2m3 + m2m13 + m3m12).

2.60. A factory manufactures a certain type of article. Each article
may have a defect, the probability of which is p. A manufactured article
is checked successively by k inspectors; the ith inspector detects a defect
(if any) with probability p_i (i = 1, 2, ..., k). If a defect is detected,
the article is rejected. Find the probabilities of the following events:
A = {an article is rejected};
B = {an article is rejected by the second inspec-
tor and not by the first};
C = {an article is rejected by all k inspectors}.
Answer.
P (A) = p [1 - Π_{i=1}^{k} (1 - p_i)],
P (B) = p (1 - p1) p2,  P (C) = p Π_{i=1}^{k} p_i.


2.61. A factory manufactures a certain type of article. Each article
may have a defect with probability p. An article is checked by one inspec-
tor who detects a defect with probability p1. If he does not detect a de-
fect, he lets the article pass. In addition, the inspector may make
a mistake and reject a flawless article, the probability of which event
is α. Find the probabilities of the following events:
A = {an article is rejected};
B = {an article is rejected by mistake};
C = {an article with a defect is passed to a lot of
finished products}.
Answer.
P (A) = pp1 + (1 - p) α,  P (B) = (1 - p) α,
P (C) = p (1 - p1).
2.6 2. Th e con dit ion s are the sam e as in the pre ced ing pro ble m, bu t
each art icl e is checked by two inspectors. The pro bab ilit ies tha t the
first and the sec ond ins pec tor wi ll rej ect a def ect ive art icl e are p 1
and p 2
res pec tiv ely ; the pro bab ilit ies of rej ect ing a flawless art icl e by mi sta ke
are a 1 and a 2 res pec tiv ely . If at lea st one ins pec tor rej ect s an art icl e,
it goes to waste. Fin d the pro bab ilit ies of the sam e eve nts .
Answer.
P (A) = p [1 - (1 - p1) (1 - p2)] + (1 - p) [1 - (1 - α1) (1 - α2)],
P (B) = (1 - p) [1 - (1 - α1) (1 - α2)],
P (C) = p (1 - p1) (1 - p2).
2.63. An instrument consists of n units (Fig. 2.63), and a failure of
any unit leads to a failure of the instrument as a whole. The units may

Fig. 2.63    Fig. 2.64
fail independently. The reliability (the probability of failure-free per-
formance) of each unit is p. Find the reliability P of the whole instru-
ment. What must be the reliability p1 of each unit to ensure the relia-
bility P1 of the instrument?
Remark. From now on elements without which a system cannot
operate are represented as links connected in series; elements which
repeat each other are connected in parallel. The reliability of each
element is written in a respective square.
Answer. P = p^n,  p1 = P1^{1/n}.

2.64. To increase the reliability of an instrument, it is repeated by
another instrument of the same kind (Fig. 2.64); the reliability (the
probability of failure-free performance) of each instrument is p. When
the first instrument fails, the second instrument is instantaneously
switched on (the reliability of the switching device is equal to unity).
Find the reliability P of a system of two instruments which repeat
each other.
Solution. For the system to fail, the two instruments must fail
simultaneously; the reliability of the system P = 1 - (1 - p)^2.
2.65. The same problem, but the reliability of the switching device
Sw is p1 (Fig. 2.65).
Answer. The reliability of the system P = 1 - (1 - p) (1 - p1p).
2.66. To increase the reliability of an instrument, it is repeated by
(n - 1) instruments of the same kind (Fig. 2.66); the reliability of

Fig. 2.65    Fig. 2.66

each instrument is p. Find the reliability P of the system. How many
instruments must be used to increase the reliability to the assigned value
P1?
Answer. P = 1 - (1 - p)^n,  n ≥ log (1 - P1)/log (1 - p).
2.67. The same problem, but a device with reliability p1 is used to
switch on every stand-by instrument (Fig. 2.67).
Answer.
P = 1 - (1 - p) (1 - p1p)^{n-1},  n ≥ [log (1 - P1) - log (1 - p)]/log (1 - p1p) + 1.

2 68* A syst em cons tsts of n untt s the rel1 abtlt ty of each of \Vhtcb
1s p r\ fatlu re of at leas t one untt lead s to the fa1lure of the \vhole syste m
To 1ncrease the relia bilit y of the syst em 1ts o11eratto.n 1s repe ated , for
wh1ch purp ose anot her n untt s are allo tted The s'vlt chtn g devi ces are
com plete ly relia ble DPte rmJn e 1vh1eh of the fo1lolv!ng repe atin g tech
ntqu es lS the more rel1.able (a) each un1t 1s repe ated (Ftg 2 S8a) (b) the
who le syst em IS repe ated (Fig 2 68b)
Solt1 hon The relra btlit y of the syst em repe ated by tech ntqu e (a) IS
P.. = U ~ (t - P)~tn 1and that of th~ syst em repe ated by tech niqu e (b)
ts Pb = 1 - (1 - pn) Let u.~ sho\v th(lt Pa > Pb for any n. > 1 and
Ch. 2. Algeb ra of Event s 53

0 < p < 1. Since
P_a = [1 - (1 - p)^2]^n = [1 - 1 + 2p - p^2]^n = p^n (2 - p)^n,
P_b = 1 - (1 - p^n)^2 = 1 - 1 + 2p^n - p^{2n} = p^n (2 - p^n),
it is sufficient to prove that (2 - p)^n > 2 - p^n. We set q = 1 - p
(q > 0) and the inequality assumes the form
(1 + q)^n > 2 - (1 - q)^n, or (1 + q)^n + (1 - q)^n > 2.

Fig. 2.67    Fig. 2.68

Applying the binomial theorem, we note that all the negative terms
are eliminated:
(1 + q)^n + (1 - q)^n = 1 + nq + [n (n - 1)/2] q^2 + ...
+ 1 - nq + [n (n - 1)/2] q^2 - ...
= 2 + n (n - 1) q^2 + ... > 2,
and this proves the required inequality.
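The inequality P_a > P_b can also be checked numerically for particular n and p; the following sketch (ours, not part of the text) does so for p = 0.9.

    def p_unit_redundancy(n, p):
        """Technique (a): each unit duplicated, then n duplicated pairs in series."""
        return (1 - (1 - p)**2)**n

    def p_system_redundancy(n, p):
        """Technique (b): the whole chain of n units duplicated as a whole."""
        return 1 - (1 - p**n)**2

    for n in (2, 5, 10):
        pa, pb = p_unit_redundancy(n, 0.9), p_system_redundancy(n, 0.9)
        print(n, round(pa, 4), round(pb, 4), pa > pb)   # P_a exceeds P_b in every case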

Fig. 2.69    Fig. 2.70

2.69. In a techn ical syste m not all units are repea ted but only some
of them which are less reliab le. The reliab ilities of the units are show n
in Fig. 2.69. Find the reliab ility P of the syste m.
Answer. P = [1 - (1 - p1)^2][1 - (1 - p2)^3] p3 p4 [1 - (1 - p5)^2].
2. 70. An instru ment consists of three units . The first unit conta ins n 1
elements, the second conta ins n 2 eleme nts and the third conta ins n 3 ele-

ments. For the instrument to function, unit I is obviously necessary;
the two other units, II and III, repeat each other (Fig. 2.70). The relia-
bility of the elements is the same and equal to p. A failure of one ele-
ment means a failure of the whole unit. The elements fail independently
of one another. Find the reliability P of the instrument.
Solution. The reliability of unit I is P_I = p^{n1}; the reliability of
unit II is P_II = p^{n2}; the reliability of unit III is P_III = p^{n3}; the relia-
bility of the repeating pair (II and III) is 1 - (1 - p^{n2}) (1 - p^{n3}); the
reliability of the instrument is
P = p^{n1} [1 - (1 - p^{n2}) (1 - p^{n3})].

2.71. There is an electrical device which can fail (burn out) only at
the moment when it is switched on. If the device has been switched on
k - 1 times and has not burned out, then the conditional probability
of its burning out on the kth switching operation is Q_k. Find the proba-
bilities of the following events:
A = {the device withstands no less than n switching
operations};
B = {the device withstands no more than n switching
operations};
C = {the device burns out exactly on the nth switching
operation}.
Solution. The probability of the event A is equal to the probability
that the device will not burn out on the first n switching operations:
P (A) = Π_{k=1}^{n} (1 - Q_k).
To find the probability of the event B, we pass to the complementary
event B̅ = {the device will withstand more than n switching opera-
tions}. For that event to occur, it is sufficient that the device should
not burn out on the first (n + 1) operations:
P (B̅) = Π_{k=1}^{n+1} (1 - Q_k),  P (B) = 1 - Π_{k=1}^{n+1} (1 - Q_k).
For the device to burn out exactly on the nth switching operation, it
is necessary that it should not burn out on the first (n - 1) operations
and should burn out on the nth operation:
P (C) = Q_n Π_{k=1}^{n-1} (1 - Q_k).

2.72. An instrument consists of four units; two of them (I and II)
are obviously necessary for the instrument to function and the other
two (III and IV) repeat each other (Fig. 2.72). The units can only fail
when they are switched on. On the kth switching operation unit I fails

(independently of the other units) with probability q_I(k), unit II with
probability q_II(k), and units III and IV fail with the same probabil-
ity q_III(k) = q_IV(k) = q(k). Find the probabilities of the events A,
B, C as in Problem 2.71.
Solution. The problem reduces to the preceding problem when the
conditional probability Q_k that a sound device will fail on the kth swit-
ching operation is found:
Q_k = 1 - [1 - q_I(k)] [1 - q_II(k)] [1 - (1 - q(k))^2],
and is substituted into the solution obtained.

Fig. 2.72

2.73. An instrument consists of three units. When the instrument is
switched on, a fault occurs in the first unit with probability p1, in the
second unit with probability p2, and in the third unit with probability
p3. Faults occur in the units independently of one another. Each of
the three units is obviously necessary for the instrument to function.
For a unit to fail completely at least two faults must occur. Find the
probability of an event
A = {the device withstands n switching operations}.

Solution. For the event A to occur, it is necessary that all the three
units should function. The probability that the first unit withstands n
switching operations is equal to the probability that no more than one
fault (0 or 1) occurs when it is switched on n times: (1 - p1)^n +
np1 (1 - p1)^{n-1}. The probability that all the three units withstand
n switching operations is
P (A) = Π_{i=1}^{3} [(1 - p_i)^n + n p_i (1 - p_i)^{n-1}].

2. 74. An instrument consists of three units, only one of which is


necessary for the instrument to function, the other two repeat each other.
Faults. only occur in the instrument when it is functioning and can
occm m any of the components constituting the units with the same
probability. The first unit consists of n1 components, the second of n2
components, the third of n3 components (n1 + n2 + n3 = n). When
a fault occurs in at least one component, the unit fails.
Four faults are known to have occurred in the instrument (in four
different components). Find the probability that these faults prevent the
instrument from functioning.
Solution. An event A = {the instrument cannot function} is divided
in two: A = A1 + A2, where

A 1 = {the first unit fails};


A2 = {the first unit does not fail but the second and
the third do}.

For the event A1 to occur, it is necessary that at least one of the four
faults be in the first unit:
P (A1) = 1 - P (A̅1)
= 1 - [(n - n1)/n]·[(n - n1 - 1)/(n - 1)]·[(n - n1 - 2)/(n - 2)]·[(n - n1 - 3)/(n - 3)].
To find the probability of the event A2, the probability of the event
A̅1 = {the first unit does not fail} must be multiplied by the probabil-
ity that the second and third units fail (with due regard for the fact
that all the four faults occur in the second and third units). The last
event may occur in three variants: either one fault occurs in the second
unit and the other three in the third unit, or, conversely, three faults occur
in the second unit and one fault occurs in the third unit, or else two
faults occur in the second and in the third unit each.
The probability of the first variant is
𝒫1 = C_4^1 · n2/(n2+n3) · n3/(n2+n3-1) · (n3-1)/(n2+n3-2) · (n3-2)/(n2+n3-3).
That of the second variant is
𝒫2 = C_4^1 · n2/(n2+n3) · (n2-1)/(n2+n3-1) · (n2-2)/(n2+n3-2) · n3/(n2+n3-3),
and that of the third variant is
𝒫3 = C_4^2 · n2/(n2+n3) · (n2-1)/(n2+n3-1) · n3/(n2+n3-2) · (n3-1)/(n2+n3-3).
Hence
P (A2) = [1 - P (A1)] (𝒫1 + 𝒫2 + 𝒫3),  P (A) = P (A1) + P (A2).
2.75. An instrument which must be very reliable is assembled from k
parts D1, D2, ..., Dk. Before the assembly starts, each part is inspected
thoroughly and, if it proves to be of very high quality, is included
in the instrument, and if not, is replaced by another part, which is
also inspected. There is a stock of spare parts of each type: m_i parts of
type D_i (i = 1, ..., k), a total of m = Σ_{i=1}^{k} m_i parts. If there are
not enough spare parts, the assembly is postponed. The probability
that a part of type D_i proves to be of high quality is equal to p_i and does
not depend on the quality of the other parts. Find the probabilities
of the following events:
A = {the stock of spare parts is sufficient for the assembly
of the instrument};
B = {all the spare parts have been used (tested and
included in the instrument or only tested)};
C = {using the available stock of spare parts, the fitters
will assemble the instrument and at least one spare
part will remain}.

Solution. The event A = Π_{i=1}^{k} A_i, where A_i = {at least one part
of type D_i proves to be high quality}.
P (A_i) = 1 - (1 - p_i)^{m_i},  P (A) = Π_{i=1}^{k} [1 - (1 - p_i)^{m_i}].
The event B = Π_{i=1}^{k} B_i, where B_i = {all the parts of type D_i have been
used}. For event B_i to occur, the first m_i - 1 parts of type D_i must be
low quality:
P (B_i) = (1 - p_i)^{m_i - 1}.
To find P (C), we represent the event A as a sum of mutually exclu-
sive events
A = C + F,   (2.75)
where F = {the fitter manages to assemble the instrument but no
spare parts remain in stock}; F = Π_{i=1}^{k} F_i, where F_i = {the first m_i - 1
parts of type D_i prove to be low quality and the m_i-th part proves to
be high quality}.
From formula (2.75) we have
P (C) = P (A) - P (F) = Π_{i=1}^{k} [1 - (1 - p_i)^{m_i}] - Π_{i=1}^{k} [(1 - p_i)^{m_i - 1} p_i].

2.76. A device S consists of n units, each of which may end up in
one of the following states as a result of its operation for the time τ:
s1 = {the unit is in good order};
s2 = {the unit must be adjusted};
s3 = {the unit must be repaired};
s4 = {the unit is out of order}.
P (s1) = p1,  P (s2) = p2,  P (s3) = p3,  P (s4) = p4,
p1 + p2 + p3 + p4 = 1.
The states of separate units are independent. Find the probabilities
of the following events:
A = {all the units are in good order};
B = {all the units must be adjusted};

C = {one unit must be repaired and the other units must
be adjusted};
D = {at least one unit is completely out of order};
E = {two units must be adjusted, one unit must be repaired,
and the other units are in good order}.
Solution. P (A) = p1^n,  P (B) = p2^n. One unit, which must be
repaired, can be chosen in C_n^1 = n ways:
P (C) = n p3 p2^{n-1},  P (D) = 1 - (1 - p4)^n.
Two units which must be adjusted can be chosen out of n units in
C_n^2 = n (n - 1)/2 ways; one unit which must be repaired can be chosen
in C_{n-2}^1 = n - 2 ways:
P (E) = [n (n - 1) (n - 2)/2] p2^2 p3 p1^{n-3}.

2.77. Six messages are sent over a communication channel, each of
which, independently of the other messages, may be distorted with probability
p = 0.2. Find the probabilities of the following events:
C = {exactly two messages of the six are distorted};
D = {no less than two messages of the six are distorted}.
Solution. By the theorem on the repetition of trials [see formula (2.0.15)]
P (C) = P_{2,6} = C_6^2 · 0.2^2 · 0.8^4 ≈ 0.246. By formula (2.0.18) we have
P (D) = 1 - (P_{0,6} + P_{1,6}) = 1 - (0.8^6 + 6 · 0.2 · 0.8^5)
= 1 - (0.262 + 0.393) = 0.345.

2.78. A total of n messages are sent over a communication channel.
Each of them may be distorted by interference, independently of the
other messages, with probability p. Find the probabilities of the fol-
lowing events:
A = {exactly m of the n messages are distorted};
B = {no less than m messages of the n are transmitted
without distortion};
C = {no more than half of the messages are distorted};
D = {all the messages are transmitted without distortion};
E = {no less than two messages are distorted}.
Solution. By the theorem on the repetition of trials [see formu-
la (2.0.15)]
P (A) = C_n^m p^m (1 - p)^{n-m}.

Using formula (2.0.17) and replacing p by 1 - p, we obtain
P (B) = Σ_{k=m}^{n} C_n^k (1 - p)^k p^{n-k},  P (C) = Σ_{k=0}^{[n/2]} C_n^k p^k (1 - p)^{n-k},
where [n/2] is the integral part of the number n/2.
P (D) = (1 - p)^n;
the same value can be obtained from formula (2.0.15):
P (D) = P_{0,n} = C_n^0 p^0 (1 - p)^n = 1·1·(1 - p)^n.
By formula (2.0.18) we find that
P (E) = R_{2,n} = 1 - Σ_{k=0}^{1} C_n^k p^k (1 - p)^{n-k}
= 1 - [(1 - p)^n + np (1 - p)^{n-1}]
= 1 - (1 - p)^{n-1} [1 + (n - 1) p].
2.79. A device S consists of five units, each of which may fail during
its operation with probability p = 0.4. Units fail independently of one
another. If more than two units fail, the device cannot function; if
one or two units fail, the device functions but with a lower efficiency.
Find the probabilities of the following events:
A = {not a single unit fails in the device};
B = {the device can function};
F = {the device functions with a lower efficiency}.
Solution.
P (A) = 0.6^5 ≈ 0.0778,  P (B) = P (A) + P (C) + P (D),
where C = {exactly one unit fails}; D = {exactly two units fail}.
P (C) = P_{1,5} = 5·0.4·0.6^4 ≈ 0.259,
P (D) = P_{2,5} = C_5^2·0.4^2·0.6^3 ≈ 0.346,
P (B) = P (A) + P (C) + P (D) ≈ 0.683,
P (F) = P (C) + P (D) ≈ 0.605.
2.80*. There is a contest between k marksmen, each of whom fires n
shots at his target. The probability of hitting the target on one shot is
p_i (i = 1, ..., k) for the ith marksman. The marksman who hits the tar-
get the most wins the contest. Find the probability that one and only
one marksman will be the winner.
Solution. Any of the k marksmen may win. We find the probability
that the ith marksman is the winner (event A_i). This event can occur in
the following ways: A_i^(m) = {the ith participant hits the target exact-
ly m times and each of the other participants makes no more than
m - 1 hits} (m = 1, ..., n).

The probability P_m(i) that the ith marksman makes m hits is equal
to C_n^m p_i^m q_i^{n-m}, where q_i = 1 - p_i. We designate the probability
that the jth marksman makes no more than m - 1 hits as R_m(j):
R_m(j) = Σ_{l=0}^{m-1} C_n^l p_j^l q_j^{n-l}   (m ≥ 1).
Then the probability that all the other participants, except for the
ith, make no more than m - 1 hits is
R_m(1) R_m(2) ... R_m(i - 1) R_m(i + 1) ... R_m(k) = Π_{j≠i} R_m(j).
Summing up the probabilities obtained for all the values of m, we
get the probability that the ith participant is the only winner of the
contest:
P (A_i) = Σ_{m=1}^{n} P_m(i) Π_{j≠i} R_m(j)   (i = 1, ..., k).
Summing up these probabilities for all the participants, we obtain
P (A) = Σ_{i=1}^{k} P (A_i) = Σ_{i=1}^{k} Σ_{m=1}^{n} P_m(i) Π_{j≠i} R_m(j).
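The double sum is awkward by hand but straightforward to program; the sketch below is our own implementation of the final formula (function names are ours), with a check against the simple case of two equally skilled marksmen.

    from math import comb

    def p_exactly(m, n, p):
        return comb(n, m) * p**m * (1 - p)**(n - m)

    def p_at_most(m, n, p):
        return sum(p_exactly(l, n, p) for l in range(m + 1))

    def p_unique_winner(n, probs):
        """P{exactly one marksman scores strictly more hits than every other}."""
        total = 0.0
        for i, pi in enumerate(probs):
            for m in range(1, n + 1):
                others = 1.0
                for j, pj in enumerate(probs):
                    if j != i:
                        others *= p_at_most(m - 1, n, pj)   # R_m(j): no more than m-1 hits
                total += p_exactly(m, n, pi) * others
        return total

    print(p_unique_winner(3, [0.5, 0.5]))   # 0.6875 = 1 - P{tie} for two equal marksmen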

2.81. A family who got a new flat installed 2k new electric lamps. In
the course of a year each lamp may burn out with probability r. Find the
probability of an event A = {in the course of a year no less than half
of the lamps will have to be replaced}.
Answer.
P (A) = 1 - Σ_{m=0}^{k-1} C_{2k}^m r^m (1 - r)^{2k-m}.

2.82. A factory manufactures articles, each of which, independently of
the other articles, may have a defect with probability r. When an arti-
cle is inspected, any defect may be detected with probability p. The
checking operation consists in choosing n articles for inspection. Find
the probabilities of the following events:
A = {no defects are detected in any of the articles};
B = {a defect is detected in exactly two of the n articles};
C = {a defect is detected in no less than two of the
n articles}.
Solution. The probability that one article chosen at random contains
a defect which is detected is pr.
P (A) = (1 - pr)^n,  P (B) = C_n^2 (pr)^2 (1 - pr)^{n-2},
P (C) = 1 - (1 - pr)^{n-1} [(1 - pr) + npr].
2.83. The conditions are the same as in Problem 2.82, but now the
whole batch of articles chosen for the checking operation is rejected if

no less than four articles among the chosen n have defects. Find the
probability p that the whole batch of articles chosen for inspection
is rejected.
Answer.
p = Σ_{i=4}^{n} C_n^i (pr)^i (1 - pr)^{n-i}.

2.84. Given two instruments, the first of which consists of n 1 units


and the second consists of n 2 units. Each instrument operates for
a time t. In the course of that time each unit of the first instrument may,
independently of the other units, fail with probability q1 and each unit
of the other instrument may fail with probability q2 • Find the proba-
bility p that during the time t m 1 units will fail in the first instrument
and m 2 units will fail in the second instrument.
Answer.
p = C_{n1}^{m1} q1^{m1} (1 - q1)^{n1-m1} · C_{n2}^{m2} q2^{m2} (1 - q2)^{n2-m2}.

2.85. A coin is tossed m times. Find the probability that heads
will occur no less than k times and no more than l times (k ≤ l ≤ m).
Solution.
p = Σ_{i=k}^{l} P_{i,m} = Σ_{i=k}^{l} C_m^i (1/2)^m = (1/2)^m Σ_{i=k}^{l} C_m^i.

2.86. An instrument which consists of k units operates for a time t.


The reliability (the probability of failure-free performance) of each
unit during the time t is p. When the time t passes, the instrument
stops functioning, a technician inspects it and replaces the units that
failed. He needs a time τ to replace one unit. Find the probability P
that in the time 2τ after the stop the instrument will be ready for
work.
Solution. For the conditions of the problem to be fulfilled, it is neces-
sary that no more than two units should fail during the time t:
P = p^k + k (1 - p) p^{k-1} + [k (k - 1)/2] (1 - p)^2 p^{k-2}.
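For specific k and p this probability is a one-liner; the function below is our own sketch, and the sample values k = 10, p = 0.95 are merely illustrative.

    def ready_after_two_replacements(k, p):
        """P = p**k + k*(1-p)*p**(k-1) + k*(k-1)/2 * (1-p)**2 * p**(k-2)"""
        q = 1 - p
        return p**k + k * q * p**(k - 1) + k * (k - 1) / 2 * q**2 * p**(k - 2)

    print(round(ready_after_two_replacements(k=10, p=0.95), 4))   # ≈ 0.9885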

2.87. What is more probable, when playing with a person of the same
ability, to win (1) three games out of four or five games out of eight?
(2) no less than three games out of four or at least five games out of
eight?
Solution. (1) A = {winning 3 games out of 4}; B = {winning 5 games
out of 8}; P (A) = C_4^3 (1/2)^4 = 1/4, P (B) = C_8^5 (1/2)^8 = 7/32;
P (A) > P (B).
(2) C = {winning no less than 3 games out of 4}; D = {winning no
less than 5 games out of 8}; P (C) = C_4^3 (1/2)^4 + (1/2)^4 = 5/16; P (D) =
C_8^5 (1/2)^8 + C_8^6 (1/2)^8 + C_8^7 (1/2)^8 + (1/2)^8 = 93/256; P (D) > P (C).
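The comparison reduces to sums of binomial terms with p = 1/2; the sketch below (ours) reproduces the four fractions.

    from math import comb

    def exactly(k, n):            # probability of winning exactly k games out of n (p = 1/2)
        return comb(n, k) / 2**n

    def at_least(k, n):           # probability of winning at least k games out of n
        return sum(comb(n, j) for j in range(k, n + 1)) / 2**n

    print(exactly(3, 4), exactly(5, 8))     # 0.25 = 1/4   and   0.21875 = 7/32
    print(at_least(3, 4), at_least(5, 8))   # 0.3125 = 5/16   and   0.36328125 = 93/256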
2.88. A person belonging to a certain group of citizens may be dark-
haired with probability 0.2, brown-haired with probability 0.3, fair-

haired with probability 0.4, and red-haired with probability 0.1. A group
of six people is selected at random. Find the probabilities of the fol-
lowing events:
A = {there are no less than four fair-haired people in the
chosen group};
B = {there is no less than one red-haired person in the group};
C = {there is an equal number of fair-haired and brown-
haired people in the group}.
Solution.
P (A) = 1 - (0.6^6 + 6·0.4·0.6^5 + 15·0.4^2·0.6^4 + 20·0.4^3·0.6^3) ≈ 0.179,
P (B) = 1 - (1 - 0.1)^6 ≈ 0.469,  C = C0 + C1 + C2 + C3,
where
C0 = {there are neither fair-haired nor brown-haired
people in the group};
C1 = {there is one fair-haired and one brown-haired
person in the group and the other people are
neither fair-haired nor brown-haired};
C2 = {there are two fair-haired and two brown-haired
people in the group and the other people are
neither fair-haired nor brown-haired};
C3 = {there are three fair-haired and three brown-
haired people in the group}.
P (C0) = (1 - 0.7)^6 ≈ 0.0007,
P (C1) = [6!/(1! 1! 4!)]·0.3·0.4·(1 - 0.7)^4 ≈ 0.0292,
P (C2) = [6!/(2! 2! 2!)]·0.3^2·0.4^2·(1 - 0.7)^2 ≈ 0.1166,
P (C3) = [6!/(3! 3!)]·0.3^3·0.4^3 ≈ 0.0346,  P (C) ≈ 0.181.

2.89. N instruments operate for a period of time t. Each instrument
has a reliability p and fails independently of the other instruments.
Find the probability P (A) that the repairman who is called after the
time t to repair the faulty instruments will not make the repairs in
a time τ if he needs a time τ0 to repair one faulty instrument.
Solution. The event A consists in the number of faulty instruments
being larger than l = [τ/τ0], where [τ/τ0] is the greatest integer contained
in τ/τ0:
P (A) = Σ_{m=l+1}^{N} C_N^m (1 - p)^m p^{N-m}.
2.90. N faulty instruments are tested to locate a defect. In each test,
independently of the other tests, a defect may be located with probabili-

ty p. When a defect has been located, the instrument is passed to a re-
pair shop and the other instruments are tested. When the defects in
all the N instruments are located, the tests are terminated. The inspec-
tion procedure only allows n tests (n > N). Find the probability that
the defects in all the N instruments will be located.
tion procedure only allows n tests (n > N). Find the probability that
the defects in all the N instruments will be located.
Solution. A = {all the defects are located} = {the defects are lo-
cated at least N times}.
P (A) = Σ_{m=N}^{n} C_n^m p^m (1 - p)^{n-m}.
2.91. The conditions are the same as in the preceding problem. Find
the probability that as a result of n tests at least k instruments of the
N remain with unlocated defects (k < N).
Solution. The problem is equivalent to finding the probability that
in n tests the defects are located in no more than N - k instruments:
P (A) = Σ_{m=0}^{N-k} C_n^m p^m (1 - p)^{n-m}.

2.92*. An instrument consists of n units. The probability that the
ith unit will fail is p_i (i = 1, 2, ..., n). For the instrument to function,
all the units must operate without failure. To calculate the probabili-
ty R that the instrument fails, the probabilities p_i (i = 1, 2, ..., n)
are approximated by their arithmetic mean:
p̃ = (1/n) Σ_{i=1}^{n} p_i.   (2.92)

Will the approximation R̃ of the probability R be greater or smaller
than the true R?
n
Solution. The exact value R = 1 - l1 q;, where qi =1- Pi·
i=1
The approximate value (in the mean probability p)
n n
R=1-(1-p)n= 1-[ +(n- ~ t=1
Pi)T ='1-[ +~
i=l
qiT·

n n
\Yo must compare the quantities _IT qi and [ f ~ qi]n. The geo-
. t=l i=1
met.nc 1~1ean o!
unequal positive values is known to he smaller than
their anthmettc mean, whence it follows that
n / n n n n

l1 i~1 gi < +~1 q;, JJ qi < [ + ~ qr,


t=1 i=1
and consequently Ji < R.
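A small numeric example makes the arithmetic-mean argument concrete; the values of p_i below are arbitrary illustrative numbers (the sketch is ours, not from the text).

    def exact_failure(ps):
        prod = 1.0
        for p in ps:
            prod *= (1 - p)
        return 1 - prod

    def approx_failure(ps):
        p_mean = sum(ps) / len(ps)
        return 1 - (1 - p_mean)**len(ps)

    ps = [0.01, 0.05, 0.2]
    print(exact_failure(ps), approx_failure(ps))   # R ≈ 0.2476, R̃ ≈ 0.2381: the approximation underestimates R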

2.93. The articles a factory manufactures must each undergo four
tests. An article may pass the first test safely with probability 0.9, the
second test with probability 0.95, the third test with probability 0.8
and the fourth test with probability 0.85. Find the probability that
the article will pass
A = {all the four tests};
B = {exactly two tests};
C = {at least two tests}.
Answer. P (A) ≈ 0.581, P (B) ≈ 0.070, P (C) ≈ 0.994.
2.94. A message consists of n symbols, and to increase the reliability of transmission each symbol is repeated m times. A valid symbol at the receiving end is that which is repeated at least k times out of m. If, at the receiving end, a symbol is repeated less than k times, the symbol is not recognized and is considered to be distorted. The probability p of any symbol being transmitted correctly is the same and does not depend on whether the other symbols are recognized. Find the probabilities of the following events:
A = {a symbol in the message will be recognized at the receiving end};
B = {the whole message will be recognized at the receiving end};
C = {no more than l symbols will be distorted in the message}.
Solution. For a symbol to be recognized at the receiving end it must be repeated without distortion no less than k times out of m. The probability that a symbol will be reproduced correctly j times out of m, in accordance with formula (2.0.15), is
C_m^j p^j (1 − p)^{m−j}.
The probability that a symbol will be transmitted in a correct form no less than k times out of m can be calculated by formula (2.0.17):
P(A) = Σ_{j=k}^{m} C_m^j p^j (1 − p)^{m−j},
or, for a comparatively small k < m/2, by formula (2.0.18):
P(A) = 1 − Σ_{j=0}^{k−1} C_m^j p^j (1 − p)^{m−j}.
For the whole message to be recognized at the receiving end all n symbols must be reproduced correctly:
P(B) = [P(A)]^n.
The probability of the event C can be calculated by the formula
P(C) = Σ_{i=0}^{l} C_n^i [1 − P(A)]^i [P(A)]^{n−i}.
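For concreteness, the three probabilities can be computed as follows; the values of n, m, k, l and p below are purely illustrative assumptions.

    from math import comb

    n, m, k, l, p = 10, 5, 3, 1, 0.8        # assumed illustrative values
    P_A = sum(comb(m, j) * p**j * (1 - p)**(m - j) for j in range(k, m + 1))
    P_B = P_A ** n
    P_C = sum(comb(n, i) * (1 - P_A)**i * P_A**(n - i) for i in range(l + 1))
    print(round(P_A, 4), round(P_B, 4), round(P_C, 4))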
CHAPTER 3

The Total Probability Formula


and Bayes's Theorem

3.0. If n mutually exclusive hypotheses H1, H2, ..., Hn can be made concerning the staging of an experiment, and if an event A can occur only together with one of these hypotheses, then
P(A) = Σ_{i=1}^{n} P(Hi) P(A | Hi),  (3.0.1)
where P(Hi) is the probability of the hypothesis Hi and P(A | Hi) is the conditional probability of the event A on that assumption. Formula (3.0.1) is known as the total probability formula.
If, before the experiment was carried out, the probabilities of the hypotheses H1, H2, ..., Hn were equal to P(H1), P(H2), ..., P(Hn), and the experiment resulted in an event A, then the new (conditional) probabilities of the hypotheses can be calculated by the formula
P(Hi | A) = P(Hi) P(A | Hi) / Σ_{i=1}^{n} P(Hi) P(A | Hi)   (i = 1, 2, ..., n).  (3.0.2)
Formula (3.0.2) expresses Bayes's theorem, or the inverse probability theorem.
The initial probabilities (before the experiment) of the hypotheses P(H1), P(H2), ..., P(Hn) are called a priori, or prior, probabilities and the post-experiment probabilities P(H1 | A), P(H2 | A), ..., P(Hn | A) are called a posteriori, or inverse, probabilities. Bayes's theorem makes it possible to "revise" the probabilities of the hypotheses regarding the result of the experiment.
If an experiment resulting in an event A is succeeded by one more experiment as a result of which an event B may occur, then the probability (conditional) of the event B can be calculated by the total probability formula into which the new probabilities of the hypotheses P(Hi | A) are substituted instead of the former ones:
P(B | A) = Σ_{i=1}^{n} P(Hi | A) P(B | Hi A).  (3.0.3)
Formula (3.0.3) is sometimes called the "formula for the probabilities of future events".

Problems and Exercises


3.1. We have three identical urns. The first urn contains a white and b black balls, the second urn contains c white and d black balls, and the third urn contains only white balls. A ball is drawn at random from one of the urns. Find the probability that the ball is white.
Solution. Assume that an event A = {an appearance of a white ball}. We formulate the hypotheses:
H1 = {the first urn is selected};
H2 = {the second urn is selected};
H3 = {the third urn is selected}.
P(H1) = P(H2) = P(H3) = 1/3; P(A | H1) = a/(a + b), P(A | H2) = c/(c + d), P(A | H3) = 1.
By the total probability formula we have
P(A) = (1/3) a/(a + b) + (1/3) c/(c + d) + (1/3) · 1 = (1/3) [a/(a + b) + c/(c + d) + 1].


3.2. An instrument can operate under two kinds of conditions: normal and abnormal. Normal conditions are observed in 80 per cent of all cases and abnormal in 20 per cent of cases. The probability that the instrument fails during a time t in normal conditions is 0.1, and in abnormal conditions it may fail with probability 0.7. Find the total probability p that the instrument fails during the time t.
Solution. p = 0.8 × 0.1 + 0.2 × 0.7 = 0.22.
3.3. The articles a factory manufactures may each have a defect with probability p. There are three inspectors in the shop; each article is only checked by one inspector (by the first, the second or the third inspector with equal probability). The probability of detecting the defect (if any) for the ith inspector is p_i (i = 1, 2, 3). If an article is not rejected in the shop, it passes to the quality control department of the factory, where a defect may be detected with probability p0. Find the probabilities of the following events:
A = {the article is rejected};
B = {the article is rejected in the shop};
C = {the article is rejected by the quality control department}.
Solution. Since the events B and C are mutually exclusive and A = B + C, it follows that P(A) = P(B) + P(C). We find P(B). For an article to be rejected in the shop, first, it must have a defect and, second, the defect must be detected. By the total probability formula the probability that the defect will be detected is (1/3)(p1 + p2 + p3); hence P(B) = (1/3) p (p1 + p2 + p3).
Similarly P(C) = p [1 − (1/3)(p1 + p2 + p3)] p0, or, using the notation (1/3)(p1 + p2 + p3) = p̄, we have P(B) = p p̄, P(C) = p p0 (1 − p̄), whence P(A) = p [p̄ + p0 (1 − p̄)].
3.4. We have two urns. The first contains a white and b black balls, and the second contains c white and d black balls. A ball is drawn at random from the first urn and put into the second. Then a ball is drawn from the second urn. Find the probability that the ball is white.
Solution. An event A = {an appearance of a white ball}; the hypotheses are
H1 = {a white ball has been transferred};
H2 = {a black ball has been transferred}.
P(H1) = a/(a + b), P(H2) = b/(a + b),
P(A | H1) = (c + 1)/(c + d + 1), P(A | H2) = c/(c + d + 1),
P(A) = [a/(a + b)] (c + 1)/(c + d + 1) + [b/(a + b)] c/(c + d + 1).
3.5. On the hypothesis of the preceding problem three balls are drawn from the first urn and put into the second (it is assumed that a ≥ 3, b ≥ 3). Find the probability that a white ball will be drawn from the second urn.
Solution. We could advance the following four hypotheses:
H1 = {3 white balls were transferred};
H2 = {2 white balls and 1 black ball were transferred};
H3 = {1 white ball and 2 black balls were transferred};
H4 = {3 black balls were transferred}.
But it is easier to solve the problem having only two hypotheses, viz.
H1 = {the ball drawn from the second urn belonged to the first urn};
H2 = {the ball drawn from the second urn belonged to the second urn}.
Since three balls from the second urn belonged to the first urn and c + d balls belonged to the second urn, it follows that P(H1) = 3/(c + d + 3), P(H2) = (c + d)/(c + d + 3).
Let us find the conditional probabilities of the event A = {a white ball is drawn from the second urn} on the hypotheses H1 and H2. The probability of an appearance of a white ball belonging to the first urn does not depend on whether the ball was drawn directly from the first urn or after it had been transferred to the second urn. Therefore,
P(A | H1) = a/(a + b), P(A | H2) = c/(c + d),
whence
P(A) = [3/(c + d + 3)] a/(a + b) + [(c + d)/(c + d + 3)] c/(c + d).
3.6. We have n urns, each of which contains a white and b black balls. A ball is drawn from the first urn and put into the second, then a ball is drawn from the second urn and put into the third one, and so on. Then a ball is drawn from the last urn. Find the probability that the ball is white.
Solution. The probability of an event A2 = {a white ball is drawn from the second urn after the transfer} can be found in the same way as it was done in Problem 3.5 (for c = a, d = b):
P(A2) = [a/(a + b)] (a + 1)/(a + b + 1) + [b/(a + b)] a/(a + b + 1) = a/(a + b).
Thus the probability that a white ball is drawn from the second urn after it has been transferred is the same as before the transfer. Consequently, the probability of drawing a white ball from the third, the fourth, ..., the nth urn is the same:
P(An) = a/(a + b).
3.7. Instruments of the same kind are manufactured by two factories; the first factory produces 2/3 of all the instruments and the second factory produces 1/3. The reliability of an instrument manufactured by the first factory is p1 and that of an instrument manufactured by the second factory is p2. Find the total (average) reliability p of an instrument produced by the two factories.
Answer. p = (2/3) p1 + (1/3) p2.
3.8. There are two batches of articles of the same kind. The first batch consists of N articles of which n are faulty. The second batch consists of M articles of which m are faulty. K articles are selected at random from the first batch and L articles from the second (K < N, L < M). These K + L articles are mixed and a new batch is formed. An article is selected at random from the mixed batch. Find the probability that the article is faulty.
Solution. An event A = {the article is faulty}. The hypotheses are
H1 = {the article belongs to the first batch};
H2 = {the article belongs to the second batch}.
P(H1) = K/(K + L), P(H2) = L/(K + L), P(A) = [K/(K + L)] n/N + [L/(K + L)] m/M.
3.9. On the hypothesis of the preceding problem three articles are selected from the new, mixed, batch. Find the probability that at least one of the three articles is faulty.
Solution. The hypotheses are
H0 = {the three articles belong to the first batch};
H1 = {two articles belong to the first batch and one to the second batch};
H2 = {one article belongs to the first batch and two to the second batch};
H3 = {all the three articles belong to the second batch}.
P(H0) = K(K − 1)(K − 2)/[(K + L)(K + L − 1)(K + L − 2)],
P(H1) = 3K(K − 1)L/[(K + L)(K + L − 1)(K + L − 2)],
P(H2) = 3KL(L − 1)/[(K + L)(K + L − 1)(K + L − 2)],
P(H3) = L(L − 1)(L − 2)/[(K + L)(K + L − 1)(K + L − 2)],
P(A | H0) = 1 − (N − n)(N − n − 1)(N − n − 2)/[N(N − 1)(N − 2)],
P(A | H1) = 1 − (N − n)(N − n − 1)(M − m)/[N(N − 1)M],
P(A | H2) = 1 − (N − n)(M − m)(M − m − 1)/[NM(M − 1)],
P(A | H3) = 1 − (M − m)(M − m − 1)(M − m − 2)/[M(M − 1)(M − 2)],
P(A) = P(H0) P(A | H0) + P(H1) P(A | H1) + P(H2) P(A | H2) + P(H3) P(A | H3).
3.10. There are two boxes of identical articles, some of which are sound and the others are faulty. The first box contains a sound articles and b faulty ones, and the second box contains c sound articles and d faulty ones. An article is drawn from the first box at
random and put into the second box. After the articles in the
second box are stirred, an article is drawn at random from the second
box and put into the first box. Then an article is drawn from the first
box at random. Find the probability that the article is sound.
Solution. The hypotheses are
H1 = {the content of the articles in the first box has not changed};
H2 = {one faulty article in the first box has been replaced by a sound one};
H3 = {one sound article in the first box has been replaced by a faulty one}.

An event A ={a sound article is drawn from the first


box after the transfer}.
P(H1) = [a/(a + b)] (c + 1)/(c + d + 1) + [b/(a + b)] (d + 1)/(c + d + 1),
P(H2) = [b/(a + b)] c/(c + d + 1),
P(H3) = [a/(a + b)] d/(c + d + 1),
P(A) = {[a/(a + b)] (c + 1)/(c + d + 1) + [b/(a + b)] (d + 1)/(c + d + 1)} a/(a + b)
+ [b/(a + b)] [c/(c + d + 1)] (a + 1)/(a + b) + [a/(a + b)] [d/(c + d + 1)] (a − 1)/(a + b)
= [a(a + b)(c + d + 1) + bc − ad]/[(a + b)²(c + d + 1)] = a/(a + b) + (bc − ad)/[(a + b)²(c + d + 1)].
This solution shows that the probability that the selected article is sound does not change if the proportions of sound and faulty articles in the two boxes are the same, c/a = d/b, i.e. bc − ad = 0.
3.11. Two numbers are selected at random from the series 1, 2, ..., n. What is the probability that the difference between the first and the second chosen number is no less than m (m > 0)?
Solution. An event A consists in the difference between the first selected number k and the second number l being no less than m (i.e. k − l ≥ m). The hypotheses are
Hk = {the first selected number is k} (k = m + 1, ..., n),
P(Hk) = 1/n, P(A | Hk) = (k − m)/(n − 1),
P(A) = Σ_{k=m+1}^{n} (k − m)/[n(n − 1)] = [1 + 2 + ... + (n − m)]/[n(n − 1)] = (n − m)(n − m + 1)/[2n(n − 1)].
3.12. Four groups can be formed from N marksmen: a1 excellent marksmen, a2 good ones, a3 fair ones and a4 poor marksmen. A marksman from the ith group may hit a target in one shot with probability p_i (i = 1, 2, 3, 4). Two marksmen are selected at random to fire at the same target. Find the probability that the target will be hit at least once.
Solution. An event A = {the target is hit at least once}. The hypotheses are Hi = {a marksman from the ith group is the first to be selected} (i = 1, 2, 3, 4).
P(Hi) = a_i/N, P(A) = Σ_{i=1}^{4} P(Hi) P(A | Hi),
where P(A | Hi) may again be found from the total probability formula on four hypotheses concerning the marksman who was the second to be selected:
P(A | Hi) = [(a_i − 1)/(N − 1)] [1 − (1 − p_i)²] + Σ_{j≠i} [a_j/(N − 1)] [1 − (1 − p_i)(1 − p_j)].
3. 13. A radar unit tracks a target which may use interference. If
the target does not employ interference, then during one surveillance cy-
cle the unit may detect it with probability p 0 ; if it employs interference,
then the unit may detect it with probability p 1 < Po· The probability
that the interference will be used in one cycle is p and does not depend
on the way and time the interference was used in the other cycles. Find
the probability that the target will be detected at least once in n sur-
veillance cycles.
Solution. The total probability that the target will be detected during one cycle is (1 − p) p0 + p p1; the probability of at least one target detection in n cycles is 1 − [1 − (1 − p) p0 − p p1]^n.
3.14. A diagnostic apparatus is functioning, into which the results
of n analyses taken from a patient are fed. Each analysis (independent-
ly of the others) may prove to be erroneous with probability p. The
probability of a correct diagnosis is a nondecreasing function φ(m) of the number m of correct analyses. During the time τ of functioning of the apparatus diagnoses for k patients were made. Find the probability of the event A = {an erroneous diagnosis was made for at least one patient}.
Solution. We consider the event B = {an erroneous diagnosis was made for a certain patient}. We advance hypotheses concerning the
number of correct analyses:
H0 = {not a single correct analysis};
H1 = {exactly one correct analysis};
. . . . . . . . . . . . . . . . . . . .
Hm = {exactly m correct analyses};
. . . . . . . . . . . . . . . . . . . .
Hn = {n correct analyses}.


The probability of the event Hm for any m can be calculated by formula (2.0.15) (the theorem on repetition of trials) with p replaced by 1 − p:
P(H0) = p^n, P(H1) = n (1 − p) p^{n−1}, ...,
P(Hm) = C_n^m (1 − p)^m p^{n−m}, ..., P(Hn) = (1 − p)^n.
By the total probability formula we have
P(B) = Σ_{m=0}^{n} C_n^m (1 − p)^m p^{n−m} [1 − φ(m)], P(A) = 1 − [1 − P(B)]^k.

3.15. Each article a factory produces may have a defect with probability p. Each article is checked by a sorter who may detect a defect, if any, with probability p1, or may not detect it with probability 1 − p1. In addition, the sorter may make a mistake and reject a sound article; this occurs with probability p2. The sorter can check N articles during one shift. Find the probability R that at least one article will be assessed incorrectly, i.e. a sound one rejected as defective or a defective one passed as sound (the results of the checks of the articles are independent).
Solution. The hypotheses are
H1 = {an article has a defect};
H2 = {an article does not have a defect}.
By the total probability formula, the probability of one article being assessed incorrectly is
p̄ = p (1 − p1) + (1 − p) p2.
The probability that at least one article is assessed incorrectly is
R = 1 − (1 − p̄)^N.
3.16. There are a students obtaining "excellent" marks, b students obtaining "good" marks and c students obtaining "fair" marks in a group. In a forthcoming examination, students that usually get "excellent" marks receive only "excellent" marks, those who usually get "good" marks may receive either "excellent" or "good" marks with equal probabilities, and those who usually get "fair" marks may receive "good", "fair" or "poor" marks with equal probabilities. At the examination a student is selected at random. Find the probability of the event A = {the student receives a "good" or an "excellent" mark}.
Solution. The hypotheses are
H1 = {a student who usually gets "excellent" marks is selected};
H2 = {a student who usually gets "good" marks is selected};
H3 = {a student who usually gets "fair" marks is selected}.
P(H1) = a/(a + b + c), P(H2) = b/(a + b + c), P(H3) = c/(a + b + c).
P(A) = P(H1)·1 + P(H2)·1 + P(H3)·(1/3) = (a + b + c/3)/(a + b + c).
3.17. The conditions are the same as in the preceding problem, but three students are selected at random. Find the probability that they will receive an "excellent", a "good" and a "fair" mark (in any sequence).
Solution. An event A = {an "excellent", a "good" and a "fair" mark are obtained} is possible only for the following hypotheses:
H1 = {one student who usually obtains "fair" marks, one student who usually obtains "good" marks and one student who usually obtains "excellent" marks are selected};
H2 = {one student who usually obtains "fair" marks and two students who usually obtain "good" marks are selected};
H3 = {two students who usually obtain "fair" marks and one student who usually obtains "good" marks are selected};
H4 = {two students who usually obtain "fair" marks and one student who usually obtains "excellent" marks are selected}.
P(H1) = 6abc/[N(N − 1)(N − 2)], P(H2) = 3b(b − 1)c/[N(N − 1)(N − 2)],
P(H3) = 3bc(c − 1)/[N(N − 1)(N − 2)], P(H4) = 3ac(c − 1)/[N(N − 1)(N − 2)],
N = a + b + c.
P(A) = P(H1)·1·(1/2)·(1/3) + P(H2)·(1/2)·(1/3) + P(H3)·(1/2)·(2/9) + P(H4)·1·(2/9).
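Because the bookkeeping here is error-prone, a quick Monte Carlo check of the last formula is reassuring; the group sizes a, b, c below are arbitrary assumptions.

    import random

    def simulate(a, b, c, n_trials=200_000):
        students = ['e'] * a + ['g'] * b + ['f'] * c
        hits = 0
        for _ in range(n_trials):
            marks = []
            for s in random.sample(students, 3):
                if s == 'e':
                    marks.append('excellent')
                elif s == 'g':
                    marks.append(random.choice(['excellent', 'good']))
                else:
                    marks.append(random.choice(['good', 'fair', 'poor']))
            hits += sorted(marks) == ['excellent', 'fair', 'good']
        return hits / n_trials

    a, b, c = 3, 4, 5                       # assumed group sizes
    N = a + b + c
    D = N * (N - 1) * (N - 2)
    P = (6*a*b*c/D) * (1/2) * (1/3) + (3*b*(b-1)*c/D) * (1/2) * (1/3) \
        + (3*b*c*(c-1)/D) * (1/2) * (2/9) + (3*a*c*(c-1)/D) * (2/9)
    print(round(P, 3), round(simulate(a, b, c), 3))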

3.18. Each of the n passengers sitting in a bus may alight from it at the next stop with probability p. Moreover, at the next stop either no passengers board the bus, with probability p0, or one passenger boards the bus, with probability 1 − p0. Find the probability that when the bus continues on its way after the stop, there will again be n passengers in it.
Solution. An event A = {there are again n passengers in the bus after the stop}.
The hypotheses are
H0 = {no passengers board the bus};
H1 = {one passenger boards the bus}.
P(H0) = p0, P(H1) = 1 − p0, P(A | H0) = (1 − p)^n,
P(A | H1) = np (1 − p)^{n−1}, P(A) = p0 (1 − p)^n + (1 − p0) np (1 − p)^{n−1}.
3.19. An instrument consists of two units I and II which repeat each other (Fig. 3.19) and can operate at random under one of two sets of conditions: favourable or unfavourable. Under favourable conditions the reliability of each unit is p1 and under the unfavourable conditions it is p2. The probability that the instrument will operate under favourable conditions is P1 and under the unfavourable conditions 1 − P1. Find the total (average) reliability P of the instrument.
Answer. P = P1 [1 − (1 − p1)²] + (1 − P1) [1 − (1 − p2)²].
Fig. 3.19
3.20. There are instruments of the same kind on a shelf, among which there are a new instruments and b used instruments (a ≥ 2, b ≥ 2). Two instruments are selected at random and used for some time. Then they are replaced onto the shelf.
Then again two instruments are selected at random. Find the probability of the event A = {both instruments selected the second time are new}.
Solution. The hypotheses are
H1 = {both instruments selected the first time were new};
H2 = {both instruments selected the first time had been used};
H3 = {one of the instruments selected the first time was new and the other had been used}.
P(H1) = a(a − 1)/[(a + b)(a + b − 1)], P(H2) = b(b − 1)/[(a + b)(a + b − 1)],
P(H3) = 2ab/[(a + b)(a + b − 1)],
P(A) = [a(a − 1)(a − 2)(a − 3) + b(b − 1) a(a − 1) + 2ab (a − 1)(a − 2)]/[(a + b)²(a + b − 1)²].
3.21. A message can be sent over each of n communication channels, each channel possessing different properties (or being in different conditions); n1 of the channels are in excellent condition, n2 channels are in good condition, n3 channels are in fair condition and n4 are in poor condition (n1 + n2 + n3 + n4 = n). The probability of a message being transmitted correctly is equal to p1, p2, p3 and p4 for the different channels respectively. To increase the reliability of the traffic, each message is repeated twice over two different channels which are selected at random. Find the probability that it will be transmitted correctly at least over one channel.
Solution. A = {at least one message is transmitted correctly}. We advance four hypotheses concerning the group to which the channel over which the first of the two messages is sent belongs: H1, H2, H3, H4, where Hi = {the first message is sent over a channel in the ith group}, P(Hi) = n_i/n (i = 1, 2, 3, 4). By the total probability formula
P(A) = Σ_{i=1}^{4} P(Hi) P(A | Hi).
On the hypothesis Hi the conditional probability of the event A is
P(A | Hi) = [(n_i − 1)/(n − 1)] [1 − (1 − p_i)²] + Σ_{j≠i} [n_j/(n − 1)] [1 − (1 − p_i)(1 − p_j)].

3.22. An instrument consists of two units I and II (Fig. 3.22), both of which must be faultless for the instrument to operate, and a voltage stabilizer S which can be either sound or faulty. When the stabilizer is sound, the reliabilities of units I and II are p1 and p2 respectively; when the stabilizer is faulty, they are p1′ and p2′. The stabilizer is sound with probability ps. Find the total reliability P of the instrument.
Answer. P = ps p1 p2 + (1 − ps) p1′ p2′.
3.23. The conditions are the same as in Problem 3.22 but units I and II repeat each other (Fig. 3.23).
Answer. P = ps [1 − (1 − p1)(1 − p2)] + (1 − ps) [1 − (1 − p1′)(1 − p2′)].
Fig. 3.22    Fig. 3.23
3.24. There are n test papers, each of which contains two questions. A student does not know the answers to all the 2n questions but only knows those to k < 2n questions. Find the probability p that he will pass the test if he either answers both questions in one test paper chosen at random or one question in his paper and one question (which the examiner chooses) in another paper.
Solution. The hypotheses are
H1 = {the student knows the answers to both questions on his paper};
H2 = {the student knows the answer to one question out of the two questions on his paper}.
p = [k(k − 1)/(2n(2n − 1))] · 1 + [2k(2n − k)/(2n(2n − 1))] · (k − 1)/(2n − 2).
3.25*. An artiller y target may be either at point I with probab ility p 1
or at point li with probab ility p 2 = 1 - p 1 (p 1 > 1/2). We have n
"hells, each of which can he fued at either point I or point II. Each shell
may hilt he target, indepe ndently of the other shells, with probab ility p.
How many shells n 1 must be fired at point I to hit the target with
a maxim um probab ility?
Solutio n. The event A = {the target is hit when n 1 shells are fired
al point I}. The hypoth eses are
H 1 = {the target is at point I}; P {H1 ) = p 1 ;
H 2 = {the target is at point II}; P (H 2 ) = 1 - p 1 •
P(A) = p1 [1 − (1 − p)^{n1}] + (1 − p1) [1 − (1 − p)^{n−n1}].
Considering P(A) to be a function of the continuous argument n1, we find that
dP(A)/dn1 = −[p1 (1 − p)^{n1} − (1 − p1)(1 − p)^{n−n1}] ln(1 − p),
d²P(A)/dn1² = −[p1 (1 − p)^{n1} + (1 − p1)(1 − p)^{n−n1}] ln²(1 − p) < 0,
and see that the function possesses a single maximum at the point
ñ1 = n/2 + ln[(1 − p1)/p1]/[2 ln(1 − p)], where dP(A)/dn1 = 0.
Note that ñ1 > n/2 for p1 > 1/2.
If the number obtained ñ1 ≤ n and is an integer, then it is the required number; if it is not an integer (but ≤ n), then we have to calculate P(A) for the two nearest integers and choose the integer for which P(A) is greater. If the number ñ1 is greater than n, then all n shells must be fired at point I (this happens for ln[(1 − p1)/p1] < n ln(1 − p), i.e. p1 > [1 + (1 − p)^n]^{−1}).
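The optimal allocation is easy to explore numerically; the sketch below evaluates P(A) for every integer n1 and also prints the stationary point ñ1 (the values of n, p and p1 are assumed purely for illustration).

    from math import log

    n, p, p1 = 10, 0.2, 0.8                 # assumed illustrative values

    def P_hit(n1):
        return p1 * (1 - (1 - p) ** n1) + (1 - p1) * (1 - (1 - p) ** (n - n1))

    n1_tilde = n / 2 + log((1 - p1) / p1) / (2 * log(1 - p))
    best = max(range(n + 1), key=P_hit)
    print(round(n1_tilde, 2), best, round(P_hit(best), 4))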
3.26. A plane is landing. If the weather is favourable, the pilot landing the plane can see the runway. In this case the probability of a safe landing is p1. If there is a low cloud ceiling, the pilot has to make a blind landing by instruments. The reliability (the probability of failure-free functioning) of the instruments needed for a blind landing is P. If the blind landing instruments function normally, the plane makes a safe landing with the same probability p1 as in the case of a visual landing. If the blind landing instruments fail, then the pilot may make a safe landing with probability p* < p1.
Find the total probability of a safe landing if it is known that in k per cent of cases there is a low cloud ceiling.
Solution. A = {a safe landing}. The hypotheses are
H1 = {the cloud ceiling is high};
H2 = {the cloud ceiling is low}.
P(H1) = 1 − k/100, P(H2) = k/100, P(A | H1) = p1.
P(A | H2) can also be found by the total probability formula:
P(A | H2) = P p1 + (1 − P) p*,
P(A) = (1 − k/100) p1 + (k/100) [P p1 + (1 − P) p*].
3.27. The first of three urns contains a white and b black balls, the second contains c white and d black balls, and the third contains k white balls (and no black balls). A certain Petrov chooses an urn at random and draws a ball from it. The ball is white. Find the probabilities that the ball came from (a) the first, (b) the second or (c) the third urn.

Solution. We use Bayes's formula (3.0.2) to solve the problem. The hypotheses are
H1 = {the first urn is chosen};
H2 = {the second urn is chosen};
H3 = {the third urn is chosen}.
A priori (before the experiment) each hypothesis is equally possible, i.e. P(H1) = P(H2) = P(H3) = 1/3.
The event A = {a white ball is drawn} occurred. We find the conditional probabilities:
P(A | H1) = a/(a + b), P(A | H2) = c/(c + d), P(A | H3) = 1.
By Bayes's formula, the a posteriori probability that the ball was drawn from the first urn is
P(H1 | A) = (1/3) P(A | H1) / [(1/3) Σ_{i=1}^{3} P(A | Hi)] = [a/(a + b)] / [a/(a + b) + c/(c + d) + 1].
Similarly
P(H2 | A) = [c/(c + d)] / [a/(a + b) + c/(c + d) + 1],
P(H3 | A) = [a/(a + b) + c/(c + d) + 1]^{−1}.
3.28. An instrument consists of two units. Each unit must function for the instrument to operate. The reliability (the probability of failure-free performance for time t) of the first unit is p1 and that of the second unit is p2. The instrument is tested for a time t and fails. Find the probability that only the first unit failed and the second unit is sound.
Solution. Four hypotheses are possible before the experiment:
H0 = {both units are sound};
H1 = {the first unit failed and the second one is sound};
H2 = {the first unit is sound and the second one failed};
H3 = {both units failed}.
The probabilities of the hypotheses are
P(H0) = p1 p2, P(H1) = (1 − p1) p2, P(H2) = p1 (1 − p2), P(H3) = (1 − p1)(1 − p2).
The event A = {the instrument failed} occurred:
P(A | H0) = 0, P(A | H1) = P(A | H2) = P(A | H3) = 1.
By Bayes's formula we have
P(H1 | A) = (1 − p1) p2 / [(1 − p1) p2 + p1 (1 − p2) + (1 − p1)(1 − p2)] = (1 − p1) p2 / (1 − p1 p2).
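A two-line numerical check of this posterior, with assumed reliabilities p1 = 0.8 and p2 = 0.9 (the same numbers reappear in Problem 3.41 below):

    p1, p2 = 0.8, 0.9
    priors = [(1 - p1) * p2, p1 * (1 - p2), (1 - p1) * (1 - p2)]   # H1, H2, H3 (H0 is excluded by A)
    posteriors = [w / sum(priors) for w in priors]
    print([round(x, 3) for x in posteriors])     # ~0.643, 0.286, 0.071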
3.29. As for Problem 3.26, except that it is known that the plane landed safely. Find the probability that the pilot used the blind landing instruments.
Solution. If the pilot landed blind, then the cloud ceiling was low (hypothesis H2). From Problem 3.26 we find that
P(H2 | A) = (k/100) [P p1 + (1 − P) p*] / {(1 − k/100) p1 + (k/100) [P p1 + (1 − P) p*]}.
3.30. A fisherman has three favourite places for fishing and he may visit each with equal probability. If he casts a line at the first place, the fish may bite with probability p1, at the second place with probability p2 and at the third place with probability p3. It is known that the fisherman cast a line three times and that he got only one bite. Find the probability that he fished at the first place.
Answer.
P(H1 | A) = p1 (1 − p1)² / Σ_{i=1}^{3} p_i (1 − p_i)².
3.31. Each article a factory manufactures may have a defect with probability p. Each article is checked on the assembly line by one of two sorters with equal probability. The first sorter detects a defect with probability p1 and the second with probability p2. If an article is not rejected on the assembly line, it passes to the quality control department, where defects, provided that they exist, are detected with probability p3. An article is known to be rejected. Find the probability that it was rejected (1) by the first sorter, (2) by the second sorter or (3) by the quality control department.
Solution. Before the experiment four hypotheses are possible:
H0 = {the article is not rejected};
H1 = {the article is rejected by the first sorter};
H2 = {the article is rejected by the second sorter};
H3 = {the article is rejected by the quality control department}.
The event A = {the article is rejected} occurred; we do not need the hypothesis H0 since P(A | H0) = 0.
P(H1) = p p1/2, P(H2) = p p2/2, P(H3) = p [1 − (p1 + p2)/2] p3.
After the experiment the probabilities of the hypotheses are
P(H1 | A) = p1/[2p3 + (p1 + p2)(1 − p3)], P(H2 | A) = p2/[2p3 + (p1 + p2)(1 − p3)],
P(H3 | A) = p3 [2 − (p1 + p2)]/[2p3 + (p1 + p2)(1 − p3)].
Ch. 3. Tota l Pro bab ility For mul a and Bay es's The ore m 79

3.32. Three out of ten students taking a test are excellently prepared, four are well prepared, two are adequately prepared and one is poorly prepared. There are 20 questions on the test paper. A student who was excellently prepared can answer all 20 questions, a student who was well prepared can answer 16 questions, a student who was adequately prepared can answer 10 questions and the student who was poorly prepared can answer only five questions. A student was called at random and answered three arbitrarily chosen questions. Find the probability that the student was (1) excellently prepared, and (2) poorly prepared.
Solution. The hypotheses are
H1 = {the student was excellently prepared};
H2 = {the student was well prepared};
H3 = {the student was adequately prepared};
H4 = {the student was poorly prepared}.
A priori: P(H1) = 0.3, P(H2) = 0.4, P(H3) = 0.2, P(H4) = 0.1.
The event A = {the student answered three questions}. P(A | H1) = 1,
P(A | H2) = (16/20)(15/19)(14/18) ≈ 0.491,
P(A | H3) = (10/20)(9/19)(8/18) ≈ 0.105,
P(A | H4) = (5/20)(4/19)(3/18) ≈ 0.009.
A posteriori:
(1) P(H1 | A) = 0.3·1/(0.3·1 + 0.4·0.491 + 0.2·0.105 + 0.1·0.009) ≈ 0.58,
(2) P(H4 | A) = 0.1·0.009/0.518 ≈ 0.002.
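The same posterior is reproduced by a few lines of code that mirror formula (3.0.2):

    priors = [0.3, 0.4, 0.2, 0.1]
    can_answer = [20, 16, 10, 5]
    likelihoods = [k/20 * (k - 1)/19 * (k - 2)/18 for k in can_answer]
    evidence = sum(pr * l for pr, l in zip(priors, likelihoods))
    posteriors = [pr * l / evidence for pr, l in zip(priors, likelihoods)]
    print([round(x, 3) for x in posteriors])    # ~0.579, 0.379, 0.041, 0.002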
3.33. A radar unit picks up a mixture of a legitimate signal and noise with probability p, and noise alone with probability 1 − p. If a legitimate signal arrives with noise, then the unit registers the presence of a signal with probability p1. If noise alone arrives, then the unit registers the presence of a signal with probability p2. The unit is known to have registered the presence of a signal. Find the probability that there was a legitimate signal.
Answer.
p p1/[p p1 + (1 − p) p2].
3.34. A passenger can buy a train ticket at one of three booking-offices. The probability that he buys a ticket at a booking-office depends on its location; the probabilities for the three are p1, p2 and p3 respectively. The probability that all the tickets are sold by the time the passenger arrives is equal to P1 for the first booking-office, to P2 for the second, and to P3 for the third. The passenger buys a ticket at a booking-office. Find the probability that it is the first one.
Solution.
P(H1) = p1, P(H2) = p2, P(H3) = p3; A = {he buys a ticket};
P(A | H1) = 1 − P1, P(A | H2) = 1 − P2, P(A | H3) = 1 − P3.
P(H1 | A) = p1 (1 − P1)/[p1 (1 − P1) + p2 (1 − P2) + p3 (1 − P3)].

3.35. A missile is fired at a plane on which there are two targets, I and II (Fig. 3.35). The probability of hitting target I is p1 and that of hitting target II is p2. It is known that target I was not hit. Find the probability that target II was hit.
Solution. The hypotheses are
H1 = {target I is hit};
H2 = {target II is hit};
H3 = {neither target is hit}.
The event A = {target I is not hit}.
P(H1) = p1, P(H2) = p2, P(H3) = 1 − (p1 + p2),
P(A | H1) = 0, P(A | H2) = 1, P(A | H3) = 1.
By Bayes's formula we have
P(H2 | A) = p2/[p2 + 1 − (p1 + p2)] = p2/(1 − p1).
The problem can also be solved without resort to Bayes's formula:
P(H2 | A) = P(H2 A)/P(A) = P(H2)/P(A) = p2/(1 − p1).

3.36. Signals are sent with probabilities Q1, Q2, Q3 under one of three sets of conditions R1, R2, R3; under each set of conditions a signal may reach the destination undistorted by interference with probabilities p1, p2 and p3 respectively. Three signals were sent under one set of conditions (it is not known which) and one of the signals was distorted. Find the a posteriori probabilities that the signals were sent under the first, the second and the third set of conditions.
Solution. The hypotheses are
H1 = {conditions R1};
H2 = {conditions R2};
H3 = {conditions R3}.
The a priori probabilities are P(H1) = Q1, P(H2) = Q2, P(H3) = Q3.
The event A = {one signal is distorted and the other two are not}.
P(A | Hi) = 3 p_i² (1 − p_i)  (i = 1, 2, 3),
P(Hi | A) = Q_i p_i² (1 − p_i) / Σ_{j=1}^{3} Q_j p_j² (1 − p_j)  (i = 1, 2, 3).
Ck. 3. Total Probability Formula and Bayes 's Theorem 81

3.37. The cause of a fatal air crash is being investigated, and four hypotheses are possible: H1, H2, H3, H4. Statistically P(H1) = 0.2, P(H2) = 0.4, P(H3) = 0.3 and P(H4) = 0.1. The event A = {fuel ignition} was found to have occurred. By the same statistics, the conditional probabilities for the event A, given the hypotheses H1, H2, H3, H4, are P(A | H1) = 0.9, P(A | H2) = 0, P(A | H3) = 0.2 and P(A | H4) = 0.3. Find the a posteriori probabilities for the hypotheses.
Answer.
P(H1 | A) = 2/3, P(H2 | A) = 0, P(H3 | A) = 2/9, P(H4 | A) = 1/9.
3.38. A targe t being track ed may be in one of two states : H 1 and H 2 •
The a prior i proba biliti es of these events are P (H1 ) = 0.3 and P (H2 ) =
= 0.7. Two sources of infor matio n yield contr adict ory infor matio n
concerning the state of the targe t. The first source repor ts that the targe t
is in state H 1 and the second repor ts that it is in state H 2 • In gener al,
the first source yields correct infor matio n about the state of the targe t
in 90 per cent of cases and is erroneous in only 10 per cent of cases. The
second source is less reliab le, it yields correct infor matio n in 70 per
cent of cases and is erroneous in 30 per cent of cases. Analy sing the
information received, find the new (a posteriori) proba biliti es of state s
H 1 and H 2 •
Solution. The event A = {the first source repor ts H 1 and the second·
reports H 2 }. On the hypotheses H 1 and H 2 the cond itiona l proba biliti es
of this event are
P (A I H 1) = P {the first source sent correct infor matio n
and the second was erroneous} = 0.9·0 .3 = 0.27,

P (A 'H 2) = P {the fi.rst source was erroneous and the second sent
correct infor matio n}= 0.1·0 .7 = 0.07.
By Bayes's formula we have

P(H1 | A) = 0.3·0.27/(0.3·0.27 + 0.7·0.07) ≈ 0.623,


P(H2 1A)= 1-P (H1 1A)~ 0.377.
3.39. Under the conditions of the preceding problem there are three equally reliable sources of information which yield correct information in 70 per cent of cases and are erroneous in 30 per cent of cases. Two sources report that the target is in state H1 and one source reports that it is in state H2. Find the a posteriori probabilities of the states H1 and H2.
Solution. A = {the first and the second sources report H1 and the third reports H2}.
P(A | H1) = 0.7·0.7·0.3 = 0.147, P(A | H2) = 0.3·0.3·0.7 = 0.063,
P(H1 | A) = 0.3·0.147/(0.3·0.147 + 0.7·0.063) = 0.5, P(H2 | A) = 0.5,

i.e. after the experiment the hypotheses H1 and H2 are equally possible.
3.40. Before the experiment, it is possible to make n mutually exclusive hypotheses which form a complete group: H1, H2, ..., Hn, each hypothesis having an a priori probability P(H1), P(H2), ..., P(Hn). It became clear as a result of the experiment that one of the hypotheses of the group H1, ..., Hk was the true one and the other hypotheses were impossible, i.e. H1 + H2 + ... + Hk = A and Hk+1 + ... + Hn = Ā. Find the a posteriori probabilities of the hypotheses.
Solution. The event observed in the experiment is A = H1 + H2 + ... + Hk. By Bayes's formula we have
P(Hi | A) = P(Hi) / Σ_{j=1}^{k} P(Hj)  (i = 1, ..., k),
i.e. if it becomes apparent as a result of the experiment that only some of the hypotheses H1, ..., Hk are possible and the others are not, then to obtain the a posteriori probabilities we must divide each of the a priori probabilities P(H1), P(H2), ..., P(Hk) by their sum.
3.41. An instrument consisting of two units I and II is being tested. The reliabilities (the probabilities of failure-free performance during time τ) of units I and II are known to be p1 = 0.8 and p2 = 0.9. The units may fail independently of each other. After the elapse of time τ it was found that the instrument was not sound. Taking this fact into account, find the probabilities of the hypotheses
H1 = {only the first unit is faulty};
H2 = {only the second unit is faulty};
H3 = {both units are faulty}.
Solution. Four hypotheses rather than three were possible before the experiment, including H0 = {both units are sound}. The experiment shows that one of the hypotheses H1, H2, H3 holds true: A = H1 + H2 + H3. The a priori probabilities of these hypotheses are
P(H1) = 0.2 × 0.9 = 0.18, P(H2) = 0.8 × 0.1 = 0.08,
P(H3) = 0.2 × 0.1 = 0.02, Σ_{i=1}^{3} P(Hi) = 0.28.
The a posteriori probabilities are
P(H1 | A) ≈ 0.18/0.28 ≈ 0.643, P(H2 | A) ≈ 0.286 and P(H3 | A) ≈ 0.071.
3.42. The instrument whose characteristics are given in Problem 3.41 is tested during time τ and is found to be faulty. To locate the fault, the instrument is put through three tests T1, T2 and T3. The first two tests give positive results and the third test gives a negative result, i.e. the event B = {+ + −} occurs, where '+' signifies a positive
result and '−' signifies a negative result. The conditional probabilities of a positive result for each test T1, T2, T3, given the hypotheses H1, H2, H3, are known; we designate them as p_ij, where i is the number of the hypothesis and j is the number of the test: p11 = 0.4, p12 = 0.6, p13 = 0.9, p21 = 0.5, p22 = 0.6, p23 = 0.6, p31 = 0.7, p32 = 0.6 and p33 = 0.93. The results of the tests are independent.
Indicate the most probable of the possible states of the instrument, for which purpose find the a posteriori probabilities of the hypotheses, given that the experiment yielded events A and B (A = {the instrument is faulty} = H1 + H2 + H3).
Solution. We take the data of the preceding problem as the a priori probabilities to estimate the results of the tests:
P(H1 | A) = 0.643, P(H2 | A) = 0.286 and P(H3 | A) = 0.071.
We calculate the conditional probabilities of the event B given these hypotheses:
P(B | H1) = 0.4·0.6·0.1 = 0.024, P(B | H2) = 0.5·0.6·0.4 = 0.120,
P(B | H3) = 0.7·0.6·0.07 = 0.0294.
By Bayes's formula we have
P(H1 | A·B) = 0.643·0.024/(0.643·0.024 + 0.286·0.120 + 0.071·0.0294) ≈ 0.298,
P(H2 | A·B) ≈ 0.662, P(H3 | A·B) ≈ 0.040.
The most probable state of the instrument is H2 = {only the second unit failed}.
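The two-stage updating (first on A, then on B) is compact in code; the sketch below simply reuses the numbers obtained in Problems 3.41 and 3.42.

    # Stage 1 (Problem 3.41): posterior over H1, H2, H3 given that the instrument failed
    p1, p2 = 0.8, 0.9
    w = [(1 - p1) * p2, p1 * (1 - p2), (1 - p1) * (1 - p2)]
    post_A = [x / sum(w) for x in w]                      # ~0.643, 0.286, 0.071

    # Stage 2 (Problem 3.42): update on the test outcome B = {+, +, -}
    likelihood_B = [0.024, 0.120, 0.0294]                 # P(B | Hi) from the solution above
    w2 = [pa * lb for pa, lb in zip(post_A, likelihood_B)]
    post_AB = [x / sum(w2) for x in w2]
    print([round(x, 3) for x in post_AB])                 # ~0.298, 0.662, 0.041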
3.43. A collect ion of n parts, any numbe r of which may be defective,
is chosen to manufa cture an instrum ent. The parts are supplie d by tw(}
factories I and II. The statisti cs show that the articles produc ed by fac-
tory I may have a defect with probab ility p 1 and those by factory II
with p 2 • The instrum ent is assembled from parts produced by the same·
factory. In the laborat ory where the instrum ent is being assembed
there are three boxes of parts, two of which contain parts from factory I
and one contain s parts from factory II. The box from which the articles
are taken is selecte d at random . The assembled instrum ent is tested.
If no less than m of the n parts are found to be defecti ve, the instrum ent
is rejected and it is reporte d as unsatis factory to the manufa cturer.
The instrum ent was rejecte d by the quality control departm ent. Find
the probab ility that its conditi on is reporte d as unsatis factory to fac-
torv• I.
Solution. The hypotheses are
H 1 = {the instrum ent was assembled from parts
manufa ctured by factory I};
H 2 = {the instrum ent was assembled from parts
manufa ctured by factory II}.

The a priori probab ilities are P (H 1) = 2/3 and P (H2 ) = 1/3.

The event A = {the instrument was rejected} = {no less than m articles out of the n were defective} occurred.
P(A | H1) = Σ_{i=m}^{n} C_n^i p1^i (1 − p1)^{n−i}, P(A | H2) = Σ_{i=m}^{n} C_n^i p2^i (1 − p2)^{n−i}.
By Bayes's formula the a posteriori probabilities are
P(H1 | A) = (2/3) Σ_{i=m}^{n} C_n^i p1^i (1 − p1)^{n−i} / [(2/3) Σ_{i=m}^{n} C_n^i p1^i (1 − p1)^{n−i} + (1/3) Σ_{i=m}^{n} C_n^i p2^i (1 − p2)^{n−i}],
P(H2 | A) = 1 − P(H1 | A).
3.44. There are two boxes with parts of the same type. The first contains a sound parts and b defective ones and the second contains c sound parts and d defective ones. A box is selected at random and one part is drawn from it and is found to be sound. Find the probability that the next part to be drawn from the box will also be sound.
Solution. The hypotheses are
H1 = {the first box was selected};
H2 = {the second box was selected}.
The event A = {a sound part was drawn first}.
P(H1) = P(H2) = 1/2, P(H1 | A) = [a/(a + b)] / [a/(a + b) + c/(c + d)],
P(H2 | A) = [c/(c + d)] / [a/(a + b) + c/(c + d)].
The event B = {a sound part was drawn second}.
P(B | A) = P(H1 | A) P(B | H1 A) + P(H2 | A) P(B | H2 A).
Given that the first box was selected and a sound part was drawn from it, the conditional probability of drawing a second sound part is
P(B | H1 A) = (a − 1)/(a + b − 1);
similarly P(B | H2 A) = (c − 1)/(c + d − 1).
Hence the required probability is
P(B | A) = [a/(a + b) + c/(c + d)]^{−1} [a(a − 1)/((a + b)(a + b − 1)) + c(c − 1)/((c + d)(c + d − 1))].

distorted. Find the probabilit y that the (k +


1)th message, sent over
the same channel, will not be distorted.
Solution. The hypotheses are
H 1 = {the messages were sent over the first channel};
H 2 = {the messages were sent over the second channel};
H 3 = {the messages were sent over the third channel}.
P(A | H1) = (1 − p1)^k, P(A | H2) = (1 − p2)^k, P(A | H3) = (1 − p3)^k,
P(Hi | A) = (1/3)(1 − p_i)^k / {(1/3)[(1 − p1)^k + (1 − p2)^k + (1 − p3)^k]}
= (1 − p_i)^k / [(1 − p1)^k + (1 − p2)^k + (1 − p3)^k]  (i = 1, 2, 3).
The event A = {k messages are not distorted} and B = {the (k + 1)th message is not distorted}.
P(B | A) = [(1 − p1)^{k+1} + (1 − p2)^{k+1} + (1 − p3)^{k+1}] / [(1 − p1)^k + (1 − p2)^k + (1 − p3)^k].
3.46. There are m batches of articles with N 1 , N 2 , • • • , N m articles
in each batch. The ith batch contains ni defective articles and N i - n 1
sound ones (i = 1, 2, ... , m). A batch is selected at random and k
articles from it are checked. They all prove to be sound. Find the
probability that the next l articles taken from the same batch will also
be sound.
Solution. The hypotheses H1, H2, ..., Hm, where Hi = {the ith batch is selected}, have equal a priori probabilities P(Hi) = 1/m (i = 1, 2, ..., m).
On the hypothesis Hi the conditional probability of the observed event A = {all the k articles checked were sound} is
P(A | Hi) = C_{Ni−ni}^k / C_{Ni}^k  (i = 1, 2, ..., m).  (3.46.1)
By Bayes's formula the a posteriori probabilities of the hypotheses are
P(Hi | A) = P(A | Hi) / Σ_{j=1}^{m} P(A | Hj)  (i = 1, 2, ..., m).  (3.46.2)
The probability of an event B = {the next l articles taken from the same batch are sound} can be calculated from the total probability formula using the a posteriori probabilities given in (3.46.2):
P(B) = Σ_{i=1}^{m} P(Hi | A) P(B | Hi A), where
P(B | Hi A) = [(Ni − ni − k)/(Ni − k)] [(Ni − ni − k − 1)/(Ni − k − 1)] ... [(Ni − ni − k − l + 1)/(Ni − k − l + 1)]  (i = 1, 2, ..., m).  (3.46.3)
Remark. Formulas (3.46.1) and (3.46.3) are valid only for k ≤ Ni − ni and l ≤ Ni − ni − k. If these conditions are not satisfied, the corresponding probabilities will be zero.
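A direct implementation of (3.46.1)-(3.46.3), with a small made-up set of batches for illustration (the batch sizes and defect counts are assumptions):

    from math import comb

    def prob_next_l_sound(N, n, k, l):
        # N[i], n[i]: batch sizes and defect counts; k articles checked sound; l more to check
        lik = [comb(Ni - ni, k) / comb(Ni, k) for Ni, ni in zip(N, n)]      # (3.46.1)
        post = [x / sum(lik) for x in lik]                                  # (3.46.2)
        def cond(Ni, ni):
            p = 1.0
            for j in range(l):
                p *= (Ni - ni - k - j) / (Ni - k - j)
            return p
        return sum(pi * cond(Ni, ni) for pi, Ni, ni in zip(post, N, n))     # (3.46.3)

    print(round(prob_next_l_sound(N=[50, 40, 60], n=[5, 10, 3], k=4, l=2), 4))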
CHAPTER 4

Discrete Random Variables

4.0. The concept of a random variable is one of the most significant in probability theory. A random variable is a variable which, as a result of an experiment with a random outcome, assumes a certain value. Examples: (1) an experiment is to fire four shots at a target; the random variable is the number of hits; (2) an experiment is the operation of a computer; the random variable is the time the computer operates until it fails.
In the set-theoretical interpretation of the fundamental concepts of probability theory which we have adopted, a random variable X is a function of an elementary event ω: X = φ(ω), where ω ∈ Ω. The value of that function depends on which elementary event ω occurs as a result of an experiment.
We shall denote random variables by capital letters and nonrandom variables by small letters.
The law of probability distribution of a random variable is the rule used to find the probability of the events related to a random variable, for instance, the probability that the variable assumes a certain value or falls in a certain interval. If a random variable X has a given distribution law, then it is said to have such and such a distribution.
The most general form of the distribution law is a distribution function, which is the probability that a random variable X assumes a value smaller than a given value x, i.e.
F(x) = P{X < x}.  (4.0.1)
The distribution function F(x) of any random variable possesses the following properties: F(−∞) = 0, F(+∞) = 1; the function F(x) does not decrease with an increase in x. The distributions of discrete random variables have a very simple form. A random variable is said to be discrete if it has a finite or countable set of possible values.
In the examples considered above the random variable X, i.e. the number of hits out of four shots being fired, is discrete and may take the values 0, 1, 2, 3, 4. The second random variable T, the time a computer operates until its first failure, is nondiscrete; its possible values continuously fill a certain part of the abscissa axis and their set is uncountable.
The simplest distribution for a discrete random variable X is an ordered sample, or ordered series, which is a table whose top row contains all the values of the random variable x1, x2, ..., xn in the ascending order and the bottom row contains the corresponding probabilities p1, p2, ..., pn:
X:  x1 | x2 | ... | xn
    p1 | p2 | ... | pn        (4.0.2)
A graphic representation of an ordered sample (Fig. 4.0.1) is known as a frequency polygon. The distribution function F(x) of a discrete random variable X is a discontinuous function (Fig. 4.0.2) whose jumps correspond to the possible values of the random variable X and are equal to the probabilities p1, p2, ... of these values respectively; between the jumps the function F(x) remains constant. At a point of discontinuity the function F(x) is equal to the value with which it approaches the point of discontinuity from the left (in Fig. 4.0.2 those values are shown by dots). The function F(x) is said to be "continuous on the left", i.e. when it approaches any point from the left, it does not suffer discontinuity, but when it approaches the point from the right, it may become discontinuous.
The probability that a random variable X falls on an interval from α to β (including α) is expressed in terms of a distribution function by the formula
P{α ≤ X < β} = F(β) − F(α),  (4.0.3)
or, in another notation,
P{X ∈ [α, β)} = F(β) − F(α),  (4.0.4)
where the '[' sign signifies that the point α is included in the interval from α to β and the ')' sign signifies that the point β is not included in it.
Fig. 4.0.1    Fig. 4.0.2
The mathematical expectation, or mean value, of a discrete random variable X is the sum of the products of all its possible values x_i and their probabilities p_i, i.e.
M[X] = Σ_i x_i p_i.  (4.0.5)
A random variable may not have a mean value if this sum diverges. When the mean value of a random variable X has to be denoted by one letter, we write M[X] = m_x.
A centred random variable is the difference between the random variable X and its mean:
X̊ = X − m_x.  (4.0.6)
The variance of a random variable X is the mean value of the square of the corresponding centred random variable:
Var[X] = M[X̊²].  (4.0.7)
For a discrete random variable X the variance can be calculated by the formula
Var[X] = Σ_i (x_i − m_x)² p_i.  (4.0.8)
For the variance of a random variable X we shall also use the designation Var_x.
The mean square deviation, or standard deviation, of a random variable X is the square root of its variance:
σ[X] = σ_x = √(Var_x)  (4.0.9)
(the arithmetic, or positive, value of the root is always implied).
The kth moment about the origin of a random variable X is the kth power mean value of that variable:
α_k[X] = M[X^k].  (4.0.10)
For a discrete random variable X the kth moment about the origin can be calculated by the formula
α_k[X] = Σ_i x_i^k p_i.  (4.0.11)
The kth central moment of a random variable is the kth power mean value of the corresponding centred variable:
μ_k[X] = M[X̊^k].  (4.0.12)
For a discrete random variable X the central moment can be calculated by the formula
μ_k[X] = Σ_i (x_i − m_x)^k p_i.  (4.0.13)
The mean value of a random variable X is its first moment about the origin and the variance is the second central moment:
M[X] = α_1[X], Var[X] = μ_2[X].  (4.0.14)
Central moments are expressed in terms of moments about the origin:
μ_2[X] = α_2[X] − m_x², μ_3[X] = α_3[X] − 3 m_x α_2[X] + 2 m_x³.  (4.0.15)
Of special importance is the first formula, which expresses the variance in terms of the second moment about the origin:
Var[X] = α_2[X] − m_x²,  (4.0.16)
or, in another notation,
Var[X] = M[X²] − (M[X])²,  (4.0.17)
i.e. the variance is equal to the mean value of the square of the random variable minus the square of the mean.
The indicator of an event A is a discrete random variable U which possesses two possible values, 0 and 1.
The random variable is equal to 0 if the event A does not occur and to unity if it occurs:
U = 0 for ω ∉ A, U = 1 for ω ∈ A.  (4.0.18)
The ordered sample of the indicator U of an event A has the form
U:  0 | 1
    q | p
where p is the probability of the event A in the experiment and q = 1 − p.
The mean and the variance of the quantity U are
M[U] = p and Var[U] = pq,  (4.0.19)
respectively.
In the theory of probability the use of the indicators of events substantially simplifies the solution of problems (see Chapter 7).
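A quick check of (4.0.5), (4.0.8) and (4.0.17) on a small made-up ordered series (the values and probabilities below are arbitrary):

    xs = [0, 1, 2, 3]
    ps = [0.1, 0.4, 0.3, 0.2]                              # assumed probabilities, sum to 1

    m = sum(x * p for x, p in zip(xs, ps))                 # (4.0.5)
    var_central = sum((x - m)**2 * p for x, p in zip(xs, ps))   # (4.0.8)
    alpha2 = sum(x**2 * p for x, p in zip(xs, ps))
    var_moment = alpha2 - m**2                             # (4.0.16)-(4.0.17)
    print(m, round(var_central, 4), round(var_moment, 4))  # the two variances agree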

When calculating the characteristics of random variables, it is often convenient to use the formula for complete expectation: if n mutually exclusive hypotheses H1, H2, ..., Hn can be made concerning the conditions of an experiment, then the complete expectation of the random variable X can be calculated by the formula
M[X] = Σ_{i=1}^{n} P(Hi) M[X | Hi],  (4.0.20)
where M[X | Hi] is the conditional expectation of the variable X on the hypothesis Hi.
The formula for complete expectation can be employed in calculations of moments about the origin of any order:
α_k[X] = Σ_{i=1}^{n} P(Hi) M[X^k | Hi].  (4.0.21)
In principle, this formula can also be used in calculating central moments of any order:
μ_k[X] = Σ_{i=1}^{n} P(Hi) M[X̊^k | Hi],  (4.0.22)
but it must be borne in mind that the quantity X̊ in (4.0.22) must be calculated as X̊ = X − m_x, where m_x is the unconditional expectation of the random variable X, as expressed by formula (4.0.20), rather than the conditional expectation on the hypothesis Hi.
There are several frequently encountered types of distributions of random
variables.
1. The binomial distribution. A random variable X is said to be binomially distributed if it may take the values 0, 1, ..., m, ..., n, and the respective probabilities are
P_m = P{X = m} = C_n^m p^m q^{n−m},  (4.0.23)
where 0 < p < 1, q = 1 − p and m = 0, 1, ..., n. Distribution (4.0.23) depends on the two parameters p and n.
From the theorem on repetition of trials [formula (2.0.15)] it follows that the number X of occurrences of an event in n independent trials has a binomial distribution. For the random variable X, which has a binomial distribution with parameters p and n, we have
M[X] = np, Var[X] = npq,  (4.0.24)
where q = 1 − p.
2. The Poisson distribution. A discrete random variable X has a Poisson distribution if its possible values are 0, 1, 2, ..., m, ..., and the probability of the event {X = m} is expressed by the formula
P_m = P{X = m} = (a^m/m!) e^{−a}  (m = 0, 1, 2, ...),  (4.0.25)
where a > 0. The Poisson distribution depends on one parameter a. For a random variable X which has a Poisson distribution
M[X] = Var[X] = a.  (4.0.26)
The Poisson distribution is the limit for a binomial distribution as p → 0 and n → ∞, given that np = a = const. We can use this fact to make approximations when we have a large number of independent trials in each of which an event A occurs with small probability.
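A tiny numerical illustration of this limit; the values of n and p below are chosen arbitrarily.

    from math import comb, exp, factorial

    n, p = 1000, 0.003                      # many trials, small success probability
    a = n * p
    for m in range(6):
        binom = comb(n, m) * p**m * (1 - p)**(n - m)
        poisson = a**m / factorial(m) * exp(-a)
        print(m, round(binom, 5), round(poisson, 5))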

The Poisson distribution can also be used for the number of points falling in a given region of space (one-dimensional, two-dimensional or three-dimensional) if the location of the points in space is random and satisfies certain restrictions.
The one-dimensional variant is encountered when flows of events are considered.
A flow of events (or traffic) is a sequence of homogeneous events occurring one after another at random moments in time (for more detail see Chapter 10).
The average number of events λ occurring per unit time is known as the intensity of the flow. The quantity λ can be either constant or variable: λ = λ(t).
A flow of events is said to be without aftereffects if the probability of the number of events falling on an interval of time does not depend on the number of events falling on any other non-overlapping interval.
A flow of events is ordinary if the probability of two or more events occurring on an elementary interval Δt is negligibly small as compared to the probability of the occurrence of just one event.
An ordinary flow of events without an aftereffect is known as a Poisson flow of traffic. If certain events form a Poisson flow, then the number X of events falling on an arbitrary time interval (t_0, t_0 + τ) has a Poisson distribution

P_m = (a^m / m!) e^(-a)   (m = 0, 1, 2, ...),   (4.0.27)

where a is the mean value of the number of points falling on the interval, a = ∫_{t_0}^{t_0+τ} λ(t) dt, and λ(t) is the intensity of the flow.
If λ = const, then the Poisson flow is said to be a stationary, or elementary, flow of the Poisson type. For an elementary flow the number of events falling on any time interval of length τ has a Poisson distribution with parameter a = λτ.
A random field of points is a collection of points randomly scattered over a plane (or in space).
The intensity (or density) of a field λ is the mean number of points which fall in a unit area (unit volume).
A field of points is said to be of the Poisson type if it possesses the following properties: (1) the probability of a number of points falling in any region of a plane (space) is independent of the number of points falling in any other non-overlapping region; (2) the probability of two or more points falling in the elementary region ΔxΔy is negligibly small as compared to the probability of one point falling in that region (the property of ordinariness).
The number X of points in a Poisson field falling in any region S of a plane (space) has a Poisson distribution

P_m = (a^m / m!) e^(-a)   (m = 0, 1, 2, ...),   (4.0.28)

where a is the mean value of the number of points falling in the region S. If the intensity of the field λ(x, y) = λ = const, the field is said to be homogeneous (a property similar to the stationariness of a flow). For a homogeneous field with intensity λ we have a = Sλ, where S is the area (volume) of the region S. If a field is inhomogeneous, then a = ∫∫_(S) λ(x, y) dx dy (for a plane) and a = ∫∫∫_(S) λ(x, y, z) dx dy dz (for space).
When doing problems related to a Poisson distribution it is convenient to use tables of the function P (m, a) = (a^m / m!) e^(-a) (see Appendix 1) and R (m, a) = Σ_{k=0}^{m} P (k, a) (see Appendix 2). The last function is the probability that the random variable X, which has a Poisson distribution with parameter a, assumes a value not exceeding m:
R (m, a) = P {X ≤ m}.
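When the tables of Appendices 1 and 2 are not at hand, the same values can be generated directly. The Python sketch below is an added illustration; the parameter a = 2.0 is chosen arbitrarily.

```python
from math import exp, factorial

def P(m, a):
    """Poisson probability P(m, a) = a^m e^(-a) / m!  [formula (4.0.25)]."""
    return a**m * exp(-a) / factorial(m)

def R(m, a):
    """R(m, a) = P{X <= m} for a Poisson variable X with parameter a."""
    return sum(P(k, a) for k in range(m + 1))

a = 2.0                      # illustrative value of the parameter
for m in range(6):
    print(m, round(P(m, a), 4), round(R(m, a), 4))
```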

3. A geometric distribution. A random variable X is said to have a geometric distribution if its possible values are 0, 1, 2, ..., m, ... and the probabilities of these values are

P_m = q^m p   (m = 0, 1, 2, ...),   (4.0.29)

where 0 < p < 1, q = 1 - p.
For a sequence of values m, the probabilities P_m form an infinitely decreasing geometric progression with a common ratio q. In practical applications, geometric distributions are encountered when a number of independent trials are carried out to reach a result A; in each trial the result is achieved with probability p. The random variable X is the number of "useless" trials (till the first trial in which the event A occurs).
The ordered series of the random variable X has the form

X:  0 | 1 | 2 | ... | m | ...
    p | qp | q²p | ... | q^m p | ...

The mean value of the random variable X, which has a geometric distribution, is
M [X] = q/p,   (4.0.30)
and its variance is
Var [X] = q/p².   (4.0.31)

The random variable Y = X + 1 is often considered, which is the number of trials made till a result A is achieved, the successful trial inclusive. The ordered series of the random variable Y has the form

Y:  1 | 2 | 3 | ... | m | ...
    p | qp | q²p | ... | q^(m-1) p | ...   (4.0.32)

M [Y] = 1/p;  Var [Y] = q/p².   (4.0.33)

We shall call the distribution of the random variable Y = X + 1 a geometric distribution beginning with unity.
4. A hypergeometric distribution. A random variable X with possible values 0, 1, ..., m, ..., a has a hypergeometric distribution with parameters n, a, b if

P_m = P {X = m} = C_a^m C_b^(n-m) / C_(a+b)^n   (m = 0, 1, ..., a)*).   (4.0.34)

A hypergeometric distribution occurs when there is an urn which contains a white and b black balls and n balls are drawn from it. The random variable X is the number of white balls drawn and its distribution is expressed by formula (4.0.34).
The mean of a random variable which has distribution (4.0.34) is
M [X] = na/(a + b),   (4.0.35)
and its variance is
Var [X] = [nab/(a + b)²] · (a + b - n)/(a + b - 1).   (4.0.36)

*) When formula (4.0.34) is used, we must assume that C_k^r = 0 if r > k.
Problems and Exercises
4.1. Construct the distribution function of the indicator U of an event A whose probability is p.
Solution.
F (x) = P {U < x} = 0 for x ≤ 0;  q for 0 < x ≤ 1;  1 for x > 1,
where q = 1 - p (see Fig. 4.1).
4.2. A coin is tossed three times. A random variable X is the number of heads obtained. Construct for it (1) the ordered series, (2) the frequency polygon, (3) the distribution function. Find M [X], Var [X] and σ_x.
(Fig. 4.1, Fig. 4.2 (a), (b) and Fig. 4.3 show the graphs referred to in Problems 4.1-4.3.)
Solution. The variable X has a binomial distribution with parameters n = 3 and p = 1/2. The ordered series is

X:  0 | 1 | 2 | 3
    1/8 | 3/8 | 3/8 | 1/8

The frequency polygon is shown in Fig. 4.2a and the distribution function is shown in Fig. 4.2b.
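The characteristics asked for follow directly from the ordered series; the short Python check below is an added illustration and confirms M [X] = np = 1.5 and Var [X] = npq = 0.75.

```python
from math import sqrt

values = [0, 1, 2, 3]
probs = [1/8, 3/8, 3/8, 1/8]           # ordered series of Problem 4.2

m_x = sum(x * p for x, p in zip(values, probs))
var_x = sum((x - m_x)**2 * p for x, p in zip(values, probs))
print(m_x, var_x, sqrt(var_x))         # 1.5, 0.75, about 0.866
```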
4.3. Considering a nonrandom variable a to be a special case of a random variable, construct for it (1) the ordered series, (2) the distribution function; (3) find its mean and variance.
Answer. (1) The ordered series consists of the single value a taken with probability 1; (2) the distribution function is shown in Fig. 4.3; (3) M [a] = a, Var [a] = 0.

4.4. A constant nonrandom variable a is added to a random variable X. How does this affect the characteristics of the random variable: (1) the mean value; (2) the variance; (3) the mean square deviation; (4) the second moment about the origin?
Answer. (1) A term a is added; (2) it does not change; (3) it does not change; (4) a term a² + 2am_x is added (since α_2 [X] = Var [X] + m_x²).
4.5. A random variabl e X is multip lied by a. How does this affect its
charact eristics : (1) the mean value; (2) the varianc e; (3) the mean square
deviation; (4) the second momen t about the origin?
Answer. (1) it is multip lied by a; (2) it. is multip lied by a2 ; (3) it is
multipl ied by I a I; (4) it is multip lied by a2 •
4.6. We toss a coin n times and consider a random variable X which is the number of heads obtained. Construct the ordered series of the random variable and find its characteristics, i.e. m_x, Var_x, σ_x, μ_3 [X].
Answer.

X:  0 | 1 | ... | m | ... | n
    (1/2)^n | C_n^1 (1/2)^n | ... | C_n^m (1/2)^n | ... | (1/2)^n

m_x = n/2;  Var_x = n/4;  σ_x = √n/2;  μ_3 [X] = 0
(since the distribution is symmetric about m_x = n/2).


4.7. We carry out n independent trials in each of which an event A may occur with probability p. Write the ordered series of the random variable X which is the number of occurrences of the complementary event Ā in n trials, and find its mean and variance.
Answer.

X:  0 | 1 | ... | m | ... | n
    p^n | C_n^1 q p^(n-1) | ... | C_n^m q^m p^(n-m) | ... | q^n

where q = 1 - p;  m_x = nq;  Var_x = npq.


4.8. We carry out n independent trials, in each of which an event A may occur with probability p. We consider a random variable R which is the frequency of occurrence of the event A in n trials, i.e. the ratio of the number of occurrences of the event A in n trials to the total number of trials n. Write the ordered series of the random variable and find its mean and variance.
Answer.

R:  0 | 1/n | ... | m/n | ... | 1
    q^n | C_n^1 p q^(n-1) | ... | C_n^m p^m q^(n-m) | ... | p^n

where q = 1 - p;  m_r = p;  Var_r = pq/n.


4.9" · We carry out. n indepen~ent trials. The probab ility of occur-
rence of an event A IS the same m all the trials and equal to p. Find
the most probab le numbe r m* of the occurrences of the event A.
Solution. We find the condition for which m* = 0. If m* = 0, then
q^n > C_n^1 p q^(n-1), or q > np, whence p < 1/(n + 1). If m* = n, then
p^n > C_n^1 q p^(n-1), p > nq, whence p > n/(n + 1).
Let us consider the case when 0 < m* < n; in this case two inequalities must be simultaneously satisfied:
C_n^{m*} p^{m*} q^{n-m*} ≥ C_n^{m*+1} p^{m*+1} q^{n-m*-1},
C_n^{m*} p^{m*} q^{n-m*} ≥ C_n^{m*-1} p^{m*-1} q^{n-m*+1}.
These two inequalities are equivalent to the following inequalities:
(m* + 1) q ≥ (n - m*) p,  (n - m* + 1) p ≥ m* q,
whence m* must be an integer satisfying the inequalities

(n + 1) p - 1 ≤ m* ≤ (n + 1) p.

It can be verified that this condition is also satisfied if p < 1/(n + 1) (m* = 0) or, in the other extreme case, if p > n/(n + 1) (m* = n).
Since the right-hand side exceeds the left-hand side by unity, there is only one integer m* between them. The only exception is the case when (n + 1) p and (n + 1) p - 1 are integers. Then there are two most probable values, (n + 1) p and (n + 1) p - 1. If np is an integer, then m* = np.
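The double inequality (n + 1)p - 1 ≤ m* ≤ (n + 1)p is easy to test by brute force. The Python sketch below is an added check; the pairs (n, p) are arbitrary test cases.

```python
from math import comb

def most_probable(n, p):
    """Index m maximizing the binomial probabilities C(n, m) p^m q^(n-m)."""
    pmf = [comb(n, m) * p**m * (1 - p)**(n - m) for m in range(n + 1)]
    return max(range(n + 1), key=lambda m: pmf[m])

for n, p in [(10, 0.3), (7, 0.4), (20, 0.05), (5, 0.9)]:   # assumed test cases
    m_star = most_probable(n, p)
    lower, upper = (n + 1) * p - 1, (n + 1) * p
    print(n, p, m_star, lower <= m_star <= upper)          # always True
```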
4.10. Two marksmen fire at two targets. Each of them fires one shot at his target independently of the other marksman. The probability of hitting the target is p_1 for the first marksman and p_2 for the second. Two random variables are considered, viz. X_1, the number of times the first marksman hits his target, and X_2, the number of times the second marksman hits his target, and their difference Z = X_1 - X_2. Construct the ordered series of the random variable Z and find its characteristics m_z and Var_z.
Solution. The random variable Z has three possible values: -1, 0 and +1.
P {Z = -1} = P {X_1 = 0} P {X_2 = 1} = q_1 p_2,
P {Z = 0} = P {X_1 = 0} P {X_2 = 0} + P {X_1 = 1} P {X_2 = 1} = q_1 q_2 + p_1 p_2,
P {Z = +1} = P {X_1 = 1} P {X_2 = 0} = p_1 q_2,
where q_1 = 1 - p_1, q_2 = 1 - p_2.
The ordered series of the variable Z has the form

Z:  -1 | 0 | +1
    q_1 p_2 | q_1 q_2 + p_1 p_2 | p_1 q_2

m_z = (-1) q_1 p_2 + 0·(q_1 q_2 + p_1 p_2) + 1·p_1 q_2 = -q_1 p_2 + p_1 q_2 = p_1 - p_2.


"'Ch. 4. Discrete Random Variables 95

We can find the variance in terms of the second moment about the
origin [see (4.0.16)]:
+ + +
a 2 [ZJ = ( -1)2 • q1P2 02 ·(qtq2 P1P2) rz.. P1q2
= q1P2+ Ptq2=P1 P2-2P1P2 +
Varz=CX 2 [Z] -m~= Pi+ P2-2P1P2- (Pt- P2) 2

= P1q1 + P2q2.
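A direct enumeration of the four joint outcomes of the two shots confirms these formulas. The Python sketch below is an added check; the values p_1 = 0.8 and p_2 = 0.6 are assumed for illustration.

```python
p1, p2 = 0.8, 0.6                      # assumed hit probabilities
q1, q2 = 1 - p1, 1 - p2

# Enumerate the joint outcomes (X1, X2) and collect the distribution of Z = X1 - X2.
dist = {}
for x1, px1 in [(0, q1), (1, p1)]:
    for x2, px2 in [(0, q2), (1, p2)]:
        z = x1 - x2
        dist[z] = dist.get(z, 0.0) + px1 * px2

m_z = sum(z * p for z, p in dist.items())
var_z = sum(z**2 * p for z, p in dist.items()) - m_z**2
print(dist)                            # {-1: q1*p2, 0: q1*q2 + p1*p2, 1: p1*q2}
print(m_z, p1 - p2)                    # both 0.2
print(var_z, p1*q1 + p2*q2)            # both 0.4
```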
4.11. Two shots are fired independently at a target. The probability
of hitting the target on each shot is p. Tl1e following random varia-
bles are considered: X, which is the difference between the number of
hits and the number of misses; Y, which is the sum of the number of
hits and the number of misses. Construct an ordered series for each of
the random variables X andY. Find their characteristics: mx, Var:\_, my~
Vary.
Solution. The ordered series of the variable X has the form

X:  -2 | 0 | 2        where q = 1 - p.
    q² | 2pq | p²

m_x = -2q² + 2p² = 2 (p - q);  α_2 [X] = 4 (q² + p²),
Var_x = α_2 [X] - m_x² = 8pq.
The variable Y is actually not random and has one value, 2; its ordered series is

Y:  2
    1

so that m_y = 2 and Var_y = 0.

4.12. We have n lamps at our disposal, each of which may have a de-
fect with probability p. A lamp is screwed into a holder and the current,
is switched on. A defective lamp immediately burns out and is replaced
by another one. A random variable X, the number of lamps that must
be tried, is considered. Construct its ordered series and find its mean
value mx.
Solution. The ordered series of the variable X has the form

X:  1 | 2 | 3 | ... | i | ... | n        where q = 1 - p.
    q | pq | p²q | ... | p^(i-1) q | ... | p^(n-1)

m_x = Σ_{i=1}^{n} i P {X = i} = (1 - p^n)/q.   (4.12.1)

4.13. A random variable X has a Poisson distribution with mean value a = 3. Construct the frequency polygon and the distribution function of the random variable X. Find the probability that (1) the random variable X will assume a value smaller than its mean; (2) the random variable X will assume a positive value.
Answer. (1) 0.423; (2) 0.950.
4.14. When a computer is operating, it fails (goes down) from time to time. The faults can be considered to occur in an elementary flow. The average number of failures per day is 1.5.
Find the probabilities of the following events:
A = {the computer does not go down for two days};
B = {it goes down at least once during a day};
C = {it goes down no less than three times during a week}.
Answer. P (A) ≈ 0.050; P (B) ≈ 0.777; P (C) ≈ 0.998.
4.15. A shell landing and exploding at a certain position covers the target by a homogeneous Poisson field of splinters with intensity λ = 2.5 splinters per sq. metre. The area of the projection of the target onto the plane on which the field of splinters falls is S = 0.8 m². If a splinter hits the target, it destroys it completely and for certain. Find the probability that the target will be destroyed.
Answer. 0.865.
4.16. The same problem, but each splinter hitting the target may destroy it with probability 0.6.
Solution. We consider a field of affecting splinters with density λ* = 0.6λ = 1.5 splinters per sq. metre rather than the assigned field of splinters. The mean value of the number of affecting splinters hitting the target is a* = λ*S = 1.2 splinters; hence the probability of destruction is R_1 = 1 - e^(-a*) = 1 - 0.301 = 0.699.
4.17. A vacuum valve operates without a failure for a random time T. When it fails, it is immediately replaced by a new one. The flow of failures is elementary and has intensity μ. Find the probabilities of A = {the valve will not be replaced during the time τ}, B = {the valve will be replaced three times} and C = {the valve will be replaced no less than three times}.
Solution. The mean value of the number of failures X during the time τ is a = μτ.
P (A) = P_0 = e^(-μτ),  P (B) = P_3 = ((μτ)³/3!) e^(-μτ),
P (C) = 1 - (P_0 + P_1 + P_2) = 1 - e^(-μτ) [1 + μτ + 0.5 (μτ)²].
4.18. A device consists of three units, the first of which contains n_1 elements, the second n_2 elements and the third n_3 elements. The first unit is obviously necessary for the device to function; the second and the third units duplicate each other. The flows of failures of the elements are elementary. For the elements of the first unit the intensity of the flow of failures is λ_1, while for the elements of the second and third units it is λ_2. The first unit fails if no less than two elements fail. The second unit (as well as the duplicating third unit) fails if at least one element fails. For the whole device to fail it is sufficient for the first unit to fail or for both the second and third units to fail. Find the probability that the device will fail during the time τ.

Solution. The probabilities that one element from the first, the second or the third unit will fail during the time τ are
p_1 = 1 - e^(-λ_1 τ) and p_2 = p_3 = 1 - e^(-λ_2 τ),
respectively. The probability that the first unit will fail during the time τ is
𝒫_1 = 1 - (1 - p_1)^(n_1) - n_1 p_1 (1 - p_1)^(n_1 - 1).
The probabilities that the second and the third unit will fail are
𝒫_2 = 1 - (1 - p_2)^(n_2) and 𝒫_3 = 1 - (1 - p_3)^(n_3).
The probability that the whole device will fail is
𝒫 = 𝒫_1 + (1 - 𝒫_1) 𝒫_2 𝒫_3.
4.19. An earth satellite, orbitting for n days, may collide with
a meteorite at random. The meteorites which cut the orbit and collide
with the satellite form a stationary Poisson flow with intensity x (mete-
orites a day). A meteorite which hits the satellite may hole its shell
with probabilit y p 0 • A meteorite which holes the shell disables some of
the instrumen ts in the satellite with probabilit y p 1 • Find the proba-
bilities of
A = {during the time of orbitting the shell of the
satellite will be holed};
B =
{the instrumen ts will fail during the orbitting of
the satellite};
C = {during the orbitting of the satellite only its
shell will be holed but the instrumen ts will not be
disabled}.
Solution. The mean value of the number of meteorites which hole the shell is a_0 = xnp_0. The mean value of the number of meteorites which hole the shell and damage the instruments is a_1 = xnp_1 p_0.
P (A) = 1 - e^(-a_0) = 1 - e^(-xnp_0),  P (B) = 1 - e^(-a_1) = 1 - e^(-xnp_0 p_1),
P (C) = P (A) - P (B) = e^(-xnp_0 p_1) - e^(-xnp_0).
4.20. A group of hunters have gathered to hunt a wolf and set off forming a random chain that can be described as an elementary flow of points with intensity λ on the x-axis (λ hunters per unit length; see Fig. 4.20). The wolf runs at right angles to the chain. A hunter fires a shot at the wolf only if the wolf runs at a distance no larger than R_0 from him and, having fired a shot, may kill it with probability p. Find the probability that the wolf will be killed if the animal does not know where the hunters are and the chain is long enough for the wolf not to be able to run beyond it.
Solution. If the wolf runs in the direction indicated by the arrow, it is fired at if at least one hunter gets into a strip 2R_0 wide connected with the path of the wolf's displacement. Every hunter who fires at the wolf may be successful, i.e. kill the wolf, with probability p. Let us pass from the chain of hunters, which has intensity λ, to the "chain of successful hunters", which has intensity λ* = λp. The wolf will be killed if at least one successful hunter gets into a strip 2R_0 long thrown at random onto the abscissa axis. The probability of this is
P (A) = 1 - e^(-2R_0 λp).
4.21. A car is subjected to an inspection and maintenance. The number of faults detected during the inspection has a Poisson distribution with parameter a. If no faults are detected, the maintenance lasts for an average of two hours. If one or two faults are detected, another half hour is spent eliminating each defect. If more than two faults are detected, the car takes four hours on the average to be repaired. Find the distribution of the time T of maintenance and repair of the car and its mean value M [T].
Solution.

T:  2 | 2.5 | 3 | 6
    e^(-a) | a e^(-a) | (a²/2) e^(-a) | 1 - e^(-a) (1 + a + a²/2)

M [T] = e^(-a) (2 + 2.5a + 1.5a²) + 6 [1 - e^(-a) (1 + a + a²/2)]
      = 6 - e^(-a) (4 + 3.5a + 1.5a²).
4.22*. A group of animals is examined by a vet and each may be found to be sick with probability p. The examination procedure involves a blood test on a sample of blood taken from n animals and mixed. The mixture will yield abnormal test results if at least one of the n animals is sick. A large number N of animals must be examined. Two methods of examination are suggested:
(1) all N animals are examined, in which case N blood tests must be made;
(2) groups of animals are examined, first mixing the blood of n animals. If the test yields normal results, then all the animals in the group are considered healthy and the next n animals are examined. If the results are abnormal, each of the n animals is examined and then the next group of animals is examined (n > 1).
Find which of the methods is more advantageous in the sense of the minimum average number of tests. Find n = n* at which the minimum average number of tests is needed to examine the animals.
Solution. A random variable X_n, which is the number of tests needed for a group of n animals examined by the second method, has the following ordered series:

X_n:  1 | n + 1        where q = 1 - p.
      q^n | 1 - q^n

The average number of tests for a group of n animals examined by the second method is
M [X_n] = q^n + (n + 1)(1 - q^n) = n + 1 - nq^n.
When the first method is used, n tests are needed for n animals. For nq^n < 1 the first method is, evidently, more advantageous than the second, while for nq^n > 1 the second method is preferable.
Let us find the q for which the second method becomes more advantageous. What then is the optimal value n = n*? It follows from the inequality nq^n > 1 that q > (1/n)^(1/n), and hence that q > 0.694, since the minimum of (1/n)^(1/n) for an integer n is attained at n = 3. Let us assume that q > 0.694 and find the value n = n* for which the average number of tests per animal is a minimum:
R_n = M [X_n]/n = 1 - q^n + 1/n.
We must find the least positive root of the equation
dR_n/dn = -q^n ln q - 1/n² = 0,
and having done so take the two closest integers, substitute them into the formula for R_n and then choose the optimal n*. By using the substitution -ln q = a, an = x, the equation -q^n ln q = 1/n² can be reduced to an equation x² e^(-x) = a (a = -ln q < (ln 3)/3 = 0.366). For small values of a (and, hence, for small p = 1 - q) the last equation has a solution x ≈ √a, whence n* ≈ 1/√a. If the values of a are not small, then a direct comparison of the quantities R_2, R_3 and R_4 shows that R_3 is always smaller than R_2 and that R_3 < R_4 for 0.694 < q < 0.876; consequently, for 0.694 < q < 0.876 the optimal n* = 3. We can show that for q > 0.889 (p < 0.111) the formula n* ≈ 1/√p + 0.5 is a close approximation.
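The reasoning above can be confirmed by simply evaluating R_n = 1 - q^n + 1/n over the integers. The Python sketch below is an added check; the disease probabilities p are assumed values, and the last column is the approximation 1/√p + 0.5.

```python
from math import sqrt

def optimal_group(p, n_max=200):
    """Integer n minimizing the average number of tests per animal R_n."""
    q = 1 - p
    return min(range(2, n_max + 1), key=lambda n: 1 - q**n + 1/n)

for p in [0.2, 0.1, 0.05, 0.01]:       # assumed values of p
    n_star = optimal_group(p)
    print(p, n_star, round(1/sqrt(p) + 0.5, 1))
# For p = 0.2 the optimum is n* = 3; for small p it approaches 1/sqrt(p) + 0.5.
```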
4.23. A number of trials are made to switch on an engine. Each trial may be successful (the engine is switched on), independently of the other trials, with probability p = 0.6. Each trial lasts for a time τ. Find the distribution of the total time T which will be needed to switch on the engine, its mean value and variance.
Solution. The number of trials X is a variable having a geometric distribution beginning with unity (4.0.32), and T = Xτ has the distribution

T:  τ | 2τ | 3τ | ... | mτ | ...
    p | qp | q²p | ... | q^(m-1) p | ...

M [T] = τ M [X] = τ/p,  Var [T] = τ² Var [X] = τ²q/p²  (q = 1 - p).
4.24. Under the conditions of the preceding problem, the trials are dependent and p_i is the probability that the engine can be switched on after i - 1 unsuccessful trials, p_i being a function of i, i.e. p_i = φ(i). Find the distribution of the random variable T = τX.
Answer.

T:  τ | 2τ | 3τ | ... | mτ | ...
    p_1 | q_1 p_2 | q_1 q_2 p_3 | ... | (∏_{i=1}^{m-1} q_i) p_m | ...

where q_i = 1 - p_i.
4.25. When a reliable instrument consisting of homogeneous parts is assembled, each part is subjected to a thorough inspection, as a result of which it is either found to be sound (with probability p) or is rejected (with probability q = 1 - p). The classification of each part is independent. The store of parts is practically unlimited. The selection of parts and their inspection go on until k high-quality parts are selected. A random variable X is the number of rejected parts. Find the distribution of the random variable X: P_m = P {X = m}.
Solution. The possible values of the random variable X are 0, 1, ..., m, .... We find their probabilities:
P_0 = P {the first k parts are sound} = p^k;
P_1 = P {one part out of the first k is rejected, the (k + 1)th part is sound} = C_k^1 q p^(k-1) p = C_k^1 q p^k;
. . . . . . . . . .
P_m = P {m parts out of the first k + m - 1 parts are rejected, the (m + k)th part is sound}
    = C_(k+m-1)^m q^m p^k   (m = 1, 2, ...).
4.26. Two random variables X and Y may be either 0 or 1, independently of each other. Their ordered series are

X:  0 | 1        Y:  0 | 1
    q_x | p_x        q_y | p_y

Construct the ordered series (1) of their sum Z = X + Y; (2) of their difference U = X - Y; (3) of their product V = XY.
Solution. (1) The random variable Z has three possible values: 0, 1 and 2.
P {Z = 0} = P {X = 0, Y = 0} = q_x q_y,
P {Z = 1} = P {X = 1, Y = 0} + P {X = 0, Y = 1} = p_x q_y + q_x p_y,
P {Z = 2} = P {X = 1, Y = 1} = p_x p_y.

Z:  0 | 1 | 2
    q_x q_y | p_x q_y + q_x p_y | p_x p_y
By analogy we find that

U:  -1 | 0 | 1
    q_x p_y | p_x p_y + q_x q_y | p_x q_y

V:  0 | 1
    q_x q_y + p_x q_y + q_x p_y | p_x p_y

4.27. A random variable X has an ordered series

X:  1 | 2 | 4
    0.5 | 0.2 | 0.3

Construct the ordered series of the random variable Y = 1/(3 - X).
Answer.

Y:  -1 | 0.5 | 1
    0.3 | 0.5 | 0.2
4.28. A random variable X has an ordered series

X:  -2 | -1 | 0 | 1 | 2
    0.2 | 0.3 | 0.2 | 0.2 | 0.1

Construct the ordered series of its square Y = X².
Answer.

Y:  0 | 1 | 4
    0.2 | 0.5 | 0.3
4.29. When a message is sent over a communication channel, noise hinders the decoding of the message. It may be impossible to decode the message with probability p. The message is repeated until it is decoded. The duration of the transmission of a message is 2 min. Find: (1) the mean value of the time T needed for the transmission of the message; (2) the probability that the transmission of the message will take a time exceeding the time t_0 at our disposal.
Solution. (1) The random variable X, the number of trials needed to send the message, has a geometric distribution beginning with unity; T = 2X min. The ordered series of the random variable T is

T:  2 | 4 | 6 | ... | 2m | ...
    q | pq | p²q | ... | p^(m-1) q | ...

where q = 1 - p, so that M [T] = 2 M [X] = 2/q min.
(2) P {T > t_0} = Σ_{m=[t_0/2]+1}^{∞} p^(m-1) q = q Σ_{m=[t_0/2]+1}^{∞} p^(m-1),   (4.29)
where [t_0/2] is the largest integer in t_0/2. Summing up the geometric progression (4.29), we obtain
P {T > t_0} = q p^([t_0/2])/(1 - p) = p^([t_0/2]).
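The closed-form tail probability can be verified by summing the ordered series directly. The Python sketch below is an added check; the values p = 0.3 and t_0 = 7 min are assumed for illustration.

```python
p = 0.3                       # assumed probability that a transmission fails
q = 1 - p
t0 = 7                        # assumed time available, in minutes
k0 = t0 // 2                  # [t0/2], the largest integer in t0/2

# Direct summation of P{T = 2m} = q * p**(m-1) over transmissions exceeding t0
tail = sum(q * p**(m - 1) for m in range(k0 + 1, 200))
print(tail, p**k0)            # both equal p**[t0/2] = 0.027
```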
4.30. A discrete random variable X has an ordered series

X:  x_1 | x_2 | ... | x_n
    p_1 | p_2 | ... | p_n

A random variable Z is the minimum of two numbers, the value of the random variable X and the number a, i.e. Z = min {X, a}, where x_1 ≤ a ≤ x_n. Find the ordered series of the random variable Z.
Solution. By definition
Z = X for X ≤ a,  Z = a for X > a.
The ordered series of the random variable Z coincides with that of the random variable X for the values x_1, x_2, ... which are smaller than a; the probability P {Z = a} can be calculated as unity minus the sum of all the other probabilities:
P {Z = a} = 1 - Σ_{x_i < a} p_i.
4.31. The ordered series of a discrete random variable X is

X:  1 | 3 | 5 | 7 | 9
    0.1 | 0.2 | 0.3 | 0.3 | 0.1

Find the ordered series of the random variable Z = min {X, 4}.
Answer. Proceeding from the solution of the preceding problem, we have

Z:  1 | 3 | 4
    0.1 | 0.2 | 0.7

4.32. Two random variables X and Y, independently of each other, assume values according to the following ordered series:

X:  0 | 1 | 2 | 3          Y:  1 | 3 | 4
    0.2 | 0.3 | 0.4 | 0.1      0.7 | 0.2 | 0.1

Construct the ordered series of the random variable Z = min {X, Y}.
Solution.
P {Z = 0} = P {X = 0} = 0.2,
P {Z = 1} = P {X = 1} + P {Y = 1} P {X > 1} = 0.3 + 0.7·0.5 = 0.65,
P {Z = 2} = P {X = 2, Y > 2} = 0.4·0.3 = 0.12,
P {Z = 3} = P {X = 3, Y ≥ 3} = 0.1·0.3 = 0.03.

Z:  0 | 1 | 2 | 3
    0.20 | 0.65 | 0.12 | 0.03
4.33. Under the conditions of the preceding problem, find the ordered series of the random variable U = max {X, Y}.
Answer.

U:  1 | 2 | 3 | 4
    0.35 | 0.28 | 0.27 | 0.10
4.34. An n-digit binary number is written in a computer cell, and each sign of the number assumes the value 0 or 1 with equal probability, independently of the other signs. The random variable X is the number of signs "1" in the notation of the binary number. Find the probabilities of the events {X = m}, {X ≥ m}, {X < m}.
Solution. The random variable X has a binomial distribution with parameters n, p = 1/2:
P {X = m} = C_n^m (1/2)^n,
P {X ≥ m} = (1/2)^n Σ_{k=m}^{n} C_n^k,  P {X < m} = 1 - P {X ≥ m}.

4.35. A proper three-place decimal fraction X is considered, each sign of which may assume any of the values 0, 1, ..., 9 with equal probability, independently of the other signs. Construct the ordered series of the random variable X and find its mean value.
Solution. The possible values of the random variable X are 0.000, 0.001, 0.002, ..., 0.999. The probability of each of them is p = (0.1)³ = 0.001. The ordered series of the random variable X has the form

X:  0.000 | 0.001 | 0.002 | ... | 0.999
    0.001 | 0.001 | 0.001 | ... | 0.001

M [X] = Σ_{i=0}^{999} x_i p_i = 0.001 Σ_{i=0}^{999} x_i, where x_i = i·0.001.
The numbers x_i form an arithmetic progression consisting of 1000 terms with a common difference 0.001. Summing up the progression, we obtain
M [X] = [(0 + 0.999)/2]·1000·0.001 = 0.4995 ≈ 0.5.

4.36. A random variable Y is a random proper binary fraction with n binary digits. Each sign, independently of the other signs, may be either 0 or 1 with probability 1/2. Find the ordered series of the random variable Y and its mean value M [Y].
Solution. As in the preceding problem, all the values of the binary number from 0.00...0 to 0.11...1 are equally probable and each of them has a probability 1/2^n. The mean value of the random variable Y (in the decimal notation) is M [Y] = 0.5 - 1/2^(n+1).
4.37. A message is sent over a communication channel in the binary code and consists of a sequence of symbols 0 and 1, which follow one another with equal probability and independently of each other, say 0, 0, 1, 1, 1, 0, 1, 1, 1, 0, 0, 0, 0, 1, 0, 1, 1, 1, ....
A run of the same symbols is considered, say 0, 0, 0 or 1, 1, 1, 1 (or simply 0 or 1 if the symbols are not repeated). Any one of the runs is taken at random. A random variable X is the number of symbols in a run. Find its ordered series, mean value and variance. Find P {X ≥ k}.
Solution. The random variable X has a geometric distribution beginning with unity:

X:  1 | 2 | ... | m | ...
    0.5 | 0.5² | ... | 0.5^m | ...

M [X] = 1/0.5 = 2,  Var [X] = 0.5/0.5² = 2,  P {X ≥ k} = Σ_{m≥k} 0.5^m = 0.5^(k-1).
4.38. Under the conditions of the preceding problem, P {"0"} = 0.7, P {"1"} = 0.3. Find the average number of symbols M [X] in a run of zeros, the average number of symbols M [Y] in a run of unities, and the overall average number of symbols M [Z] in a run of symbols chosen at random. Find the variances Var [X], Var [Y], Var [Z].
Solution. M [X] = 1/0.7 = 10/7; M [Y] = 1/0.3 = 10/3. By the formula for the complete expectation we find that M [Z] = 0.7·(10/7) + 0.3·(10/3) = 2, i.e. the same as in the preceding problem. Var [X] = 0.3/0.7² ≈ 0.612; Var [Y] = 0.7/0.3² ≈ 7.778.
To find Var [Z] we first find α_2 [Z] = M [Z²] by the formula for the complete expectation [see (4.0.20)]*):
α_2 [Z] = 0.7 α_2 [X] + 0.3 α_2 [Y], where α_2 [X] = Var [X] + (M [X])² ≈ 2.66, α_2 [Y] = Var [Y] + (M [Y])² ≈ 18.9; then α_2 [Z] ≈ 7.53,
Var [Z] = α_2 [Z] - (M [Z])² ≈ 3.53, i.e. the last variance is larger than that in the preceding problem.
The probability P {X ≥ k} can be found by the total probability formula with hypotheses H_0 = {the first symbol is "0"}, H_1 = {the first symbol is "1"}:
P (H_0) = 0.7, P (H_1) = 0.3, P {X ≥ k | H_0} = 0.7^(k-1), P {X ≥ k | H_1} = 0.3^(k-1); then P {X ≥ k} = 0.7·0.7^(k-1) + 0.3·0.3^(k-1) = 0.7^k + 0.3^k.
4.39. A device consists of m units of type I and n units of type II. The reliability (a failure-free performance for an assigned time τ) of each type I unit is p_1 and that of a type II unit is p_2. The units fail independently of one another. For the device to operate, any two type I units and any two type II units must operate simultaneously for time τ. Find the probability 𝒫 of the failure-free operation of the device.

*) By the formula for the complete expectation we seek the second moment about the origin rather than the variance itself (since conditional expectations are different for different hypotheses).
Solution. The event A = {failure-free operation of the device} is a product of two events:
A_1 = {no less than two out of m type I units operate without failure};
A_2 = {no less than two out of n type II units operate without failure}.
The number X_1 of type I units operating without failure is a random variable distributed according to the binomial law with parameters m and p_1; the event A_1 corresponds to the random variable X_1 assuming a value no smaller than 2. Therefore
P (A_1) = P {X_1 ≥ 2} = 1 - P {X_1 < 2} = 1 - P {X_1 = 0} - P {X_1 = 1}
        = 1 - (q_1^m + m q_1^(m-1) p_1)   (q_1 = 1 - p_1).
Similarly, P (A_2) = 1 - (q_2^n + n q_2^(n-1) p_2)   (q_2 = 1 - p_2).
The probability of a failure-free performance of the device is
𝒫 = P (A) = P (A_1) P (A_2) = [1 - (q_1^m + m q_1^(m-1) p_1)] [1 - (q_2^n + n q_2^(n-1) p_2)].
4.40. An instrument is used (activated) several times before it fails (an instrument that has failed is not repaired). The probability that, having been used k times, an instrument does not fail is P (k). The function P (k) is given (see Fig. 4.40). The instrument is known to have been activated n times. Find the probability Q_m that it can be activated another m times and the mean value of the number X of future activations.
Solution. Q_m is the conditional probability that the instrument will not fail when activated the next m times provided that it has not failed the first n times. By the multiplication rule for probabilities P (n + m) = P (n) Q_m, whence

Q_m = P (n + m)/P (n),   M [X] = Σ_{m=1}^{∞} Q_m.
4.41. A worker operates n machine tools of the same type, which are located along a straight line with intervals l (Fig. 4.41). From time to time the tools stop (with equal probability and independently of one another) and must be adjusted. Having adjusted one tool, the worker stays put until another tool stops, then he goes to that tool (if the same tool fails, he stays put). A random variable X is the distance the worker covers between two adjustments. Find the mean value of the random variable X.
Solution. We apply the complete expectation formula with hypotheses H_i = {the worker stays at the ith tool} (i = 1, ..., n). According to the conditions of the problem, all the hypotheses are equally probable: P (H_1) = P (H_2) = ... = P (H_n) = 1/n. Let us find the conditional expectation M [X | H_i] for the ith hypothesis. Each tool (the ith inclusive) stops with probability 1/n; hence

M [X | H_i] = (l/n) [Σ_{k=1}^{i} (i - k) + Σ_{k=i}^{n} (k - i)] = (l/(2n)) [i (i - 1) + (n - i)(n - i + 1)].

By the complete expectation formula

M [X] = Σ_{i=1}^{n} P (H_i) M [X | H_i] = Σ_{i=1}^{n} (1/n)·(l/(2n)) [i (i - 1) + (n - i)(n - i + 1)]
      = (l/(2n²)) [Σ_{i=1}^{n} i (i - 1) + Σ_{i=1}^{n} (n - i + 1)(n - i)].   (4.41)

Using the familiar formula Σ_{i=1}^{n} i² = n (n + 1)(2n + 1)/6, we calculate

Σ_{i=1}^{n} i (i - 1) = n (n + 1)(2n + 1)/6 - n (n + 1)/2 = n (n + 1)(n - 1)/3.

Using this formula to calculate both sums in (4.41), we make sure that they are equal, and

M [X] = (l/n²) Σ_{i=1}^{n} i (i - 1) = (l/n²)·n (n + 1)(n - 1)/3 = l (n² - 1)/(3n).
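The closed-form answer l(n² - 1)/(3n) is easy to verify by enumerating the pairs (current tool, stopped tool), both of which are uniform over 1, ..., n and independent. The Python sketch below is an added check; the values of n and l are assumed.

```python
def mean_distance(n, l=1.0):
    """Direct computation of M[X] by averaging |i - k|*l over all pairs."""
    total = 0.0
    for i in range(1, n + 1):          # tool where the worker stands
        for k in range(1, n + 1):      # tool that stops next
            total += abs(i - k) * l / n**2
    return total

for n in [2, 5, 10]:
    l = 1.0
    print(n, mean_distance(n, l), l * (n**2 - 1) / (3 * n))   # the two columns agree
```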

4.42. A number of independent trials are made, in each of which an event A may occur with probability p; the trials are continued until the event A occurs, after which they are terminated. A random variable X is the number of trials. Construct its ordered series and frequency polygon; derive formulas for its mean value, variance and mean square deviation.
Solution. The random variable X has a geometric distribution beginning with unity [see (4.0.32)]. The ordered series of the variable X
has the form

X:  1 | 2 | 3 | ... | m | ...
    p | qp | q²p | ... | q^(m-1) p | ...

where q = 1 - p. The frequency polygon for p = 0.6 is shown in Fig. 4.42.

M [X] = Σ_{m=1}^{∞} m q^(m-1) p = p Σ_{m=1}^{∞} m q^(m-1).

We note that m q^(m-1) is the derivative of q^m with respect to q, hence

Σ_{m=1}^{∞} m q^(m-1) = Σ_{m=1}^{∞} (d/dq) q^m = (d/dq) Σ_{m=1}^{∞} q^m.

The last sum is the sum of the terms of an infinitely decreasing geometric progression with a common ratio q; summing it up, we find that

Σ_{m=1}^{∞} m q^(m-1) = (d/dq) [q/(1 - q)] = 1/(1 - q)²,   (4.42)

whence M [X] = p/(1 - q)² = 1/p.
The variance of the random variable X can be found in terms of its second moment about the origin

α_2 [X] = M [X²] = Σ_{m=1}^{∞} m² q^(m-1) p.

To calculate it, we multiply series (4.42) by q, whence we get

Σ_{m=1}^{∞} m q^m = q/(1 - q)².

Differentiating the series, we obtain

Σ_{m=1}^{∞} m² q^(m-1) = (1 + q)/(1 - q)³.

Multiplying this expression by p = 1 - q, we get M [X²] = (1 + q)/(1 - q)². The variance of the random variable X is [see (4.0.17)]

Var [X] = M [X²] - (M [X])² = (1 + q)/(1 - q)² - 1/(1 - q)² = q/p²,
σ_x = √(Var [X]) = √q / p.
Thus, for a geometric distribution beginning with unity

M [X] = 1/p,  Var [X] = q/p²,  σ_x = √q / p.

The random variable Y = X - 1, which has a geometric distribution beginning with a zero, possesses the characteristics

M [Y] = M [X] - 1 = q/p,  Var [Y] = Var [X] = q/p²,  σ_y = √q / p.
4.43. Several trials are needed to tune up an intricate electronic circuit. The probability that the circuit will be tuned in the first trial is P_1, in the second P_2, ..., in the kth trial P_k. The probabilities P_1, P_2, ..., P_k, ..., P_n are given. After the nth unsuccessful trial to tune up the circuit, the trials are terminated. Find the mean value and the variance of the random variable X, which is the total number of trials.
Solution. The ordered series of the random variable X has the form

X:  1 | 2 | ... | k | ... | n
    P_1 | P_2 | ... | P_k | ... | 1 - Σ_{k=1}^{n-1} P_k

M [X] = Σ_{k=1}^{n-1} k P_k + n [1 - Σ_{k=1}^{n-1} P_k],
Var [X] = M [X²] - (M [X])² = Σ_{k=1}^{n-1} k² P_k + n² [1 - Σ_{k=1}^{n-1} P_k] - (M [X])².

4.44. An instrument is assembled from k types of parts, including m_i parts of type i (i = 1, ..., k; Σ_{i=1}^{k} m_i = m, m being the total number of parts). The probability that a part of type i, taken at random, has a defect is q_i. The instrument operates only when none of the parts are defective. (1) Find the probability 𝒫 that the instrument will operate; (2) find the probability R_2 that there will be no less than two defective parts in the instrument.
Solution. (1) 𝒫 is the probability that in m independent trials the event A = {a part is defective} will not occur:

𝒫 = ∏_{i=1}^{k} (1 - q_i)^{m_i} = P_0;

(2) R_2 = 1 - (P_0 + P_1), where P_0 is the probability that none of the m parts are defective and P_1 is the probability that there is exactly one defective part. P_1 can be found as the sum of the probabilities that there is exactly one defective part among the m_i parts of type i, the other parts being sound, i.e.

P_1 = m_1 q_1 (1 - q_1)^{m_1 - 1} ∏_{j≠1} (1 - q_j)^{m_j} + m_2 q_2 (1 - q_2)^{m_2 - 1} ∏_{j≠2} (1 - q_j)^{m_j} + ...
    = Σ_{i=1}^{k} m_i q_i (1 - q_i)^{m_i - 1} ∏_{j≠i} (1 - q_j)^{m_j}.
4.45. A total of k messages are sent over a communication channel, which contain n_1, n_2, ..., n_k binary symbols (0 or 1) respectively. The symbols assume the values 0 and 1 independently of one another with probability 0.5. Each symbol may be distorted (replaced by the opposite value) with probability p. When the messages are coded, a code is used which can correct errors in one or two symbols (practically with a complete trustworthiness). An error in at least one symbol (after the correction) makes the whole message erroneous. Find the probability R that at least one of the k messages will be erroneous.
Solution. The random variable X_i, defined as the number of incorrect symbols in the ith message, has a binomial distribution (4.0.23) with parameters n_i and p. The probability that the ith message is erroneous is equal to the probability that no less than three symbols of the n_i symbols in the message are incorrect:

P {X_i ≥ 3} = 1 - P {X_i < 3} = 1 - Σ_{j=0}^{2} C_{n_i}^j p^j (1 - p)^{n_i - j}.

The probability that at least one of the k messages is erroneous is

R = 1 - ∏_{i=1}^{k} Σ_{j=0}^{2} C_{n_i}^j p^j (1 - p)^{n_i - j}.
4.46. Four identical parts are needed to assemble an instrument; we have ten parts at our disposal, six of which are sound and four are faulty. All the parts are identical in appearance. Five parts are selected at random (one extra part is taken "just in case"). Find the probability that no less than four of the five parts are sound.
Solution. The random variable X, the number of sound parts among the five chosen, has a hypergeometric distribution with parameters 5, 6, 4 [see (4.0.34)]. The probability that m of the five parts are sound is P_m = C_6^m C_4^(5-m)/C_10^5; hence

P_4 = 5/21,  P_5 = 1/42,  P {X ≥ 4} = P_4 + P_5 = 11/42.

4.4i. There are seven radio valve s, three of which are fault y and
indist ingui shabl e from the other s. Four valve s are taken at rando m
and are screwed into four holde rs. Find and const ruct (as a frequ ency
polygon) an order ed series of X radio valve s which will funct ion
Find its mean value , varia nce and mean squar e devia tion. ·
Solution. The varia ble X has a hyper geom etric distri butio n with
parameters n = 4, a = 4, b = 3.
Pt=C~C;/c;=4/35. P 2 =C~q;c;=18/35,
P 3 = c~c;1c; = 12/35~P,. = C!!c; = 1/35.
The ordered series of the rando m varia ble X has the form
X· 1 I2 ! 3 I
4
• 4/35l1 SJ35l 12/35 l1J35 .
The frequency polygon of the random variable X is given in Fig. 4.47. Using formulas (4.0.35) and (4.0.36), we calculate M [X] = 16/7 ≈ 2.29 and Var [X] = 24/49 ≈ 0.49.
4.48*. A number of independent trials are made, in each of which an event A occurs with probability p. Our aim is to obtain the event A k times. The maximum possible number of trials is n (with n ≥ 2k). The trials are terminated either when the event A occurs k times or when it is clear that the event cannot occur k times, i.e. when the opposite event Ā occurs n - k + 1 times. A random variable X is the number of trials that will be made. Find the distribution of the random variable X.
Solution. The possible values of the random variable X are k, k + 1, ..., n - 1, n. Let us find the corresponding probabilities P_m = P {X = m}.
For k ≤ m ≤ n - k the trials will be terminated after the mth trial only because the event A has occurred k times. Therefore
P_m = P {as a result of m - 1 trials the event A occurs k - 1 times, and the mth trial yields the event A} = C_{m-1}^{k-1} p^k q^{m-k}.   (4.48.1)
In particular, for m = k we have P_k = p^k.
For n - k < m ≤ n the trials can be terminated after the mth trial either because the event A occurs for the kth time or because the opposite event Ā occurs n - k + 1 times. The probability of the first variant has been calculated; the probability of the second variant is
P = P {the first m - 1 trials yield n - k occurrences of the event Ā, and the mth trial yields the event Ā}
  = C_{m-1}^{n-k} p^{m-1-n+k} q^{n-k} q = C_{m-1}^{n-k} p^{m-1-n+k} q^{n-k+1}.
Adding the probabilities of the two variants together, we obtain
P_m = C_{m-1}^{k-1} p^k q^{m-k} + C_{m-1}^{n-k} p^{m-1-n+k} q^{n-k+1}.   (4.48.2)
Thus, for k ≤ m ≤ n - k the distribution of the random variable X is given by formula (4.48.1), and for n - k < m ≤ n by formula (4.48.2).
4.49. Under the conditions of the preceding problem, n = 7, k = 3 and p = 0.4. Construct the ordered series of the random variable X.
Solution. The possible values of the random variable X are 3, 4, 5, 6, 7.
P_3 = p³ = 0.4³ = 0.064,
P_4 = C_3^2 p³q = 3·0.4³·0.6 = 0.115,
P_5 = C_4^2 p³q² + C_4^4 p^0 q^5 = 6·0.4³·0.6² + 0.6^5 = 0.216,
P_6 = C_5^2 p³q³ + C_5^4 p q^5 = 10·0.4³·0.6³ + 5·0.4·0.6^5 = 0.293,
P_7 = C_6^2 p³q⁴ + C_6^4 p²q^5 = 15·0.4³·0.6⁴ + 15·0.4²·0.6^5 = 0.312.

X:  3 | 4 | 5 | 6 | 7
    0.064 | 0.115 | 0.216 | 0.293 | 0.312


CHAPTER 5

Continuous and Mixed Random Variables

5.0. The set of the possible values of a discrete random variable is finite or countable. A nondiscrete random variable is characterized by an uncountable set of possible values. Examples of nondiscrete random variables are the detection range of a radar, the time a train is late, and the error in measuring an angle with a protractor.
The sets of the possible values of all these random variables are uncountable; they continuously fill a certain section along the abscissa axis.
Recall that the distribution function of a random variable X is defined as
F (x) = P {X < x}.   (5.0.1)
The distribution function exists for any random variable, whether discrete or nondiscrete.
If the distribution function F (x) of a random variable X is continuous for any x and, in addition, has a derivative F'(x) everywhere, except maybe at individual points (Fig. 5.0.1), then the random variable X is said to be continuous.
If the distribution function F (x) continuously increases on certain intervals but jumps at particular points (Fig. 5.0.2), then the random variable is mixed. The function F (x) is continuous on the left for either a mixed or a continuous random variable.
The probability of a particular individual value of a continuous random variable is zero. The probability of a particular individual value of a mixed random variable which lies on an interval of continuity of F (x) is also zero, while the probability of each of the values x_1, x_2, ... at which the function F (x) jumps is numerically equal to the value of the corresponding jump.
For any random variable (discrete, continuous or mixed) the probability that it will fall on the interval of the abscissa axis from α to β (including α but excluding β) is expressed by the formula
P {α ≤ X < β} = F (β) - F (α).   (5.0.2)
For a continuous random variable the equality sign in (5.0.2) can be neglected:
P {α < X < β} = F (β) - F (α),   (5.0.3)
or, using another notation,
P {X ∈ (α, β)} = F (β) - F (α).   (5.0.4)

The proba bility densi ty (or distr ibuti on dens ity, or dens ity fun.ction) of a conti n-
uous rando m varia ble X is a deriv ative of the distr ibuti on funct wn:
I (x) = F' (x). (5.0.5 )

A proba bility eleme nt for a conti nuou s rando m varia ble X is ~he quanti~y
f (x) dx whic h is appro xima tely the prob abili ty that the rando m vana ble X \Vtll
fall on' the elem entar y inter val dx adjoi ning the poin t x:
I (x) dx ~ P {x < X < x + dx}. (5.0.6 )

The densi ty 1 (x) of any rando m varia ble is nonn egati ve (f (x);;;;. 0) and possesses
the prope rty
∫_{-∞}^{∞} f (x) dx = 1.   (5.0.7)

The graph of the dens ity I (x) is know n as a distri butio n curve .
The proba bility that a conti nuou s rando m varia ble X falls on the inter val from
a to p is expressed as
~

P {a< X< P}= Jf


a;
(x) dx. (5.0.8 )

The distri butio n funct ion of a conti nuou s rand om varia ble X can be expre ssed
in terms of its densi ty:
F (x) = ∫_{-∞}^{x} f (x) dx.   (5.0.9)

The mathe matic al expec tation of a conti nuou s rando m varia ble X with dens ity
I (x) is its mean value , whic h can be calcu lated from the form ula
M [X] = ∫_{-∞}^{∞} x f (x) dx.   (5.0.10)

The mathematical expectation of a mixed random variable with a distribution function F (x) can be calculated from the formula

M [X] = Σ_i x_i p_i + ∫_(c) x F'(x) dx,   (5.0.11)

where the sum extends over all the points of discontinuity of the distribution function and the integral over all the intervals of its continuity. When M [X] has to be denoted by one letter, we shall write M [X] = m_x.
The variance of a continuous random variable is
Var [X] = M [X°²] = M [(X - m_x)²]
and can be calculated by the formula

Var [X] = ∫_{-∞}^{∞} (x - m_x)² f (x) dx.   (5.0.12)
The variance of a mixed random variable is expressed by the formula

Var [X] = Σ_i (x_i - m_x)² p_i + ∫_(c) (x - m_x)² F'(x) dx,   (5.0.13)

where the sum extends over all the points of discontinuity of the function F (x) and the integral over all the intervals of its continuity.
The square root of variance is known as the mean square deviation of a random variable:

σ_x = √(Var [X]).   (5.0.14)

The coefficient of variation

v_x = σ_x / m_x   (5.0.15)

is sometimes used as a characteristic of the degree of randomness of a nonnegative random variable.
Note that the coefficient of variation depends on the origin of the random variable.
The mean square deviation can be used to approximate the range of the possible values of a random variable. This approximation is known as the three sigma rule, which states that the range of the practically possible values of a random variable X lies within the limits

m_x ± 3σ_x.   (5.0.16)

This rule is also valid for a discrete random variable.
The kth moments about the origin for a continuous and for a mixed random variable

α_k [X] = M [X^k]

are expressed by the formulas

α_k [X] = ∫_{-∞}^{∞} x^k f (x) dx,   (5.0.17)

α_k [X] = Σ_i x_i^k p_i + ∫_(c) x^k F'(x) dx,   (5.0.18)

respectively.
The central moments can be calculated from similar formulas:

μ_k [X] = ∫_{-∞}^{∞} (x - m_x)^k f (x) dx,   (5.0.19)

μ_k [X] = Σ_i (x_i - m_x)^k p_i + ∫_(c) (x - m_x)^k F'(x) dx.   (5.0.20)

The central moments can be expressed in terms of the moments about the origin, as was the case for a discrete random variable (see Chapter 4). The expression for variance in terms of the second moment about the origin is of especial practical interest:

Var [X] = α_2 [X] - m_x²,   (5.0.21)

or, in another notation,

Var [X] = α_2 [X] - (α_1 [X])².   (5.0.22)

If the probability of an event A depends on the value assumed by a continuous random variable X with density f (x), then the total probability of the event A
can be calculated from the integral formula for total probability:

P (A) = ∫_{-∞}^{∞} P (A | x) f (x) dx,   (5.0.23)

where P (A | x) = P {A | X = x} is the conditional probability of the event A on the hypothesis {X = x}.
Bayes's formula also has an analogue for continuous random variables. If an event A occurs as a result of an experiment and the probability of the event depends on the value the continuous random variable X assumes, then the conditional probability density of the random variable, the occurrence of the event A being taken into account, is

f_A (x) = f (x) P (A | x)/P (A),   (5.0.24)

or, with due regard for formula (5.0.23),

f_A (x) = f (x) P (A | x) / ∫_{-∞}^{∞} P (A | x) f (x) dx.   (5.0.25)

This formula is known as Bayes's formula.


There is also an integral formula for total expectation for continuous random variables, i.e.

M [X] = ∫_{-∞}^{∞} M [X | y] f (y) dy,   (5.0.26)

where X is any random variable; Y is a continuous random variable with density f (y), and M [X | y] is the conditional expectation of the random variable X, provided that the random variable Y has assumed a value y.

Let us consider the distributions and properties of continuous random variables that are often encountered in practical applications.
1. Uniform distribution. A random variable X has a uniform distribution on the interval from a to b if its density on that interval is constant:

f (x) = 1/(b - a) for x ∈ (a, b),  0 for x ∉ (a, b).   (5.0.27)

The values of f (x) at points a and b are not defined, but this is inessential, since the probability of falling on any of them is zero and, hence, the probability of any event related to the random variable X does not depend on the value of the density f (x) at the points a and b*). The graph of the probability density of a uniform distribution is shown in Fig. 5.0.3.

*) From now on, when we define the density f (x) by different formulas on different intervals of the x-axis, we shall not indicate the values of f (x) on the boundaries of the intervals.
The mean value, variance and mean square deviation of the random variable X which has density (5.0.27) are

M [X] = (a + b)/2,  Var [X] = (b - a)²/12,  σ_x = (b - a)/(2√3),   (5.0.28)

respectively.
A uniform distribution is inherent in the measurement errors when using an instrument with large divisions, when the value obtained is rounded off to the nearest integer (or to the nearest smaller number or nearest larger number). For instance, the error (in centimetres) when measuring the length of a pencil with a ruler with cm divisions has a uniform distribution on the interval (-1/2, 1/2) if the value is rounded off to the nearest integer, and on the interval (0, 1) if it is rounded down to the nearest smaller value. The error (in min) made using a watch with a jumping minute hand also has a uniform distribution [on the interval (0, 1)]. The rotation angle φ of a well-balanced wheel (Fig. 5.0.4) has a uniform distribution on the interval (0, 2π) if it is set in motion and stops by friction. In Buffon's needle problem the angle which defines the direction of the needle also has a uniform distribution, since the needle is thrown at random so that none of the values of φ is more likely.
Typical conditions for the occurrence of a uniform distribution are the following: a point M is thrown at random onto the x-axis divided into equal intervals of length l (Fig. 5.0.5). Each of the random intervals X and Y into which the point divides the interval in which it has fallen has a uniform distribution on the interval (0, l).
From now on we shall often use a briefer notation

f (x) = 1/(b - a) for x ∈ (a, b)   (5.0.29)

for probability density rather than the more detailed notation as in (5.0.27).
2. Exponential distribution. A random variable X has an exponential distribution if its density is defined by the formula

f (x) = λe^(-λx) for x > 0,  0 for x < 0,   (5.0.30)

where λ is the parameter of the exponential distribution (Fig. 5.0.6). An exponential distribution is especially significant in the theory of Markov processes and the queueing theory (see Chapters 10 and 11).
If there is a simple flow with intensity λ on the time axis Ot (see Chapter 4), then the time interval T between two neighbouring events has an exponential distribution with parameter λ.
The mean value, variance and mean square deviation of the variable X which has an exponential distribution are

M [X] = 1/λ,  Var [X] = 1/λ²,  σ_x = 1/λ,   (5.0.31)

respectively. The coefficient of variation of an exponential distribution is unity:

v_x = σ_x/m_x = 1.
We shall often replace the more detailed notation for the exponential distribution (5.0.30) by a briefer one, i.e.
f (x) = λe^(-λx) for x > 0,
or the even briefer notation
f (x) = λe^(-λx)  (x > 0).
A table of the function e^(-x) is given in Appendix 3.
3. Normal distribution. A random variable X has a normal distribution (or has a Gaussian probability distribution) if its density is

f (x) = [1/(σ√(2π))] e^(-(x - m)²/(2σ²))   (5.0.33)

(Fig. 5.0.7). The mean value of a random variable which has this distribution is m, the variance is σ², and the mean square deviation is σ. The probability that a random variable X, which has a normal distribution with parameters m and σ, will fall on the interval from α to β is expressed by the formula

P {α < X < β} = Φ((β - m)/σ) - Φ((α - m)/σ),   (5.0.34)

where Φ(x) is the error function:

Φ(x) = [1/√(2π)] ∫_0^x e^(-t²/2) dt.   (5.0.35)

The error function has the following properties: (1) Φ(0) = 0; (2) Φ(-x) = -Φ(x) (an odd function); (3) Φ(∞) = 0.5. The tables of the error function are given in Appendix 5.
If the interval (α, β) is symmetric about the point m, then the probability of falling in it is

P {|X - m| < l} = 2Φ(l/σ),   (5.0.36)

where l = (β - α)/2, i.e. half the length of the interval.
A normal distribution arises when the variable X results from the summation of a large number of independent (or weakly dependent) random variables which are comparable as concerns their effect on the scattering of the sum (for more detail see Chap. 8). Tables of the normal density for m = 0 and σ = 1 are given in Appendix 4.
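When the tables of Appendix 5 are not at hand, formulas (5.0.34)-(5.0.36) can be evaluated directly: the error function defined in (5.0.35) satisfies Φ(x) = ½ erf(x/√2), where erf is available in the Python standard library. The sketch below is an added illustration; the values of m, σ and the interval are assumed.

```python
from math import erf, sqrt

def Phi(x):
    """Error function of (5.0.35): Phi(x) = 0.5 * erf(x / sqrt(2))."""
    return 0.5 * erf(x / sqrt(2))

m, sigma = 10.0, 2.0                      # assumed parameters of the normal law
alpha, beta = 9.0, 13.0                   # assumed interval

p_interval = Phi((beta - m) / sigma) - Phi((alpha - m) / sigma)   # formula (5.0.34)
print(round(p_interval, 4))               # ~0.6247

# The three sigma rule (5.0.16): P{|X - m| < 3*sigma} = 2*Phi(3) ~ 0.9973
print(round(2 * Phi(3), 4))
```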
Problems and Exercises

5.1. We are given an arbitrary value of an argument. (1) Can the distribution function be greater than unity? (2) Can the probability density function be greater than unity? (3) Can the distribution function be negative? (4) Can the probability density function be negative?
Answer. (1) No; (2) yes; (3) no; (4) no.
5.2. What is the dimensionality of (1) the distribution function; (2) the probability density function; (3) the mean value; (4) the variance; (5) the mean square deviation; (6) the third moment about the origin?
Answer. (1) Dimensionless; (2) the inverse of the dimension of the random variable; (3) the dimension of the random variable; (4) the dimension of the square of the random variable; (5) the dimension of the random variable; (6) the dimension of the cube of the random variable.
5.3. Given the graph of the probability density function f_x (x) of a random variable X (Fig. 5.3), construct the probability density function f_y (x) of a random variable Y = X + a, where a is a nonrandom variable. Write the expression for f_y (x).
Solution. The distribution curve of the random variable Y is the same as the curve of distribution of f_x (x) but shifted to the right by a, i.e. f_y (x) = f_x (x - a).
5.4. Given the graph of the distribution function F_x (x) of a random variable X (Fig. 5.4), construct the distribution function F_y (x) of the random variable Y = -X.
Solution. F_y (x) = P {Y < x} = P {-X < x} = P {X > -x} = 1 - P {X < -x} = 1 - F_x (-x).
To construct the graph of the function F_y (x), we must consider the mirror reflection of the curve of F_x (x) about the axis of ordinates and subtract each ordinate from unity (see the dash line in Fig. 5.4).
5.5. Construct the distribution function F (x) for a random variable X which has a uniform distribution on the interval (a, b).
Ch. 5. Continuous and lJ!i:ced Ran dom Variables 119

Sol uti on.


:!:

F (x) = ) f (x) dx,


"
-oo
r
0 for x< a,
f(:c) = 1/( b- a) for a< x< b,

0 for x> b,
0 for x~a,
F( x) = (x-a)l(b~a) for a<x~b,
1 for x> b
(Fig . 5.5 ). , .
5.6 . Th e ran do m var iab le X is dis trib ute d acc ord ing to the 'ng ht
n
tria ngl e law " in the int erv al (0, a) (see Fig . 5.6 ). (1) Wr ite the exp res sio
f(x )
F(x)
1 -- -- -- -. ... -- -

0 a :c
o a b

Fig . 55. Fig . 5.6

c-
for the pro bab ilit y den sity fun cti on f (:c); (2) find the dis trib uti on fun
tion F (x); (3) fin d the pro bab ilit y tha t the ran do m var iab le X wil l fal l
on the int erv al fro m a/2 to a; (4) ftnd the cha rac ter isti cs of the var i-
able X: mx, Va rx, ax, ~t 3 [X] .
An sw er.
2( 1- x/a )/axE (O ,a) ,
for
(1) f(x )= { o·
j for x(f (O ,a) ,
or more con cis ely f (x) = 2 (1- x/a )/a for x E (0, a),
0 for x~O,
(2) F (x) = x (2 - xfa )/a for 0 < x~a,
1 for x> a,
(3) P {X E (a/2, a)} = F (a )- F (a/2) = 1/4,
(4) fnx =a /3, Va rx= a 2
/18 , O"x=a/(3 V2}, by for mu la (4.0.15)
!-La [XJ = a.a [X] - 3m xa 2 [X ] +
2m i = a 3/135.

5,7. A ran dom var iab le X has a Sim pso n dis trib uti on (ob eys the
"l.aw of an isosceles tria ngl e") on the int erv al fro m -a to a (see
Ftg. 5.7a). (1) Fin d the exp res sio n for the pro bab ilit y don sity fun cti on-,
120 Applied Pro~lem• fn ProbtJbtlUg Thtorv

(2) construct a graph of the d1strtbut1on function, (3) find mx, Varz
ax . ~~ 3 I X] (4) find tl1e prob abd 1ty that tb e random vttria blo X '"Ill fa11
in the 1n terv a 1 (-a/2 a).
Answer (1)

~{1-.;.) for D<~<a,


f{x)= --l-(1+.;) for -a<z<O,
0 for :t<-a or z>a,

or. more conctsely f

/(x)=~{ 1- 1
:1) for z E(-a. a),

(2) for :r E (-a a) the graph of the distribution funct1on lS formed


by t'vo parts of the parabola (Ftg 5 7b)
(3) mx = 0 Varx ==: a216 Ox = a!Jfb1 lt 3 (.t\] = 0 i
(4) P {X E (~a/2 a) =. 7/8,
5 8 The random variable X has a Cauchy distrihullon, 1 e I (x) =
aJn (1 + x'~') (1) •1nrl the factor a (2) find the dtstrtbutJon. Junctton

F(~)
1
f(.r) I
1
1
I
1
J
a. 0 a ;;r:
fa) -a /) a
(6)
F1g 57
F (.x), (3) find the probability that the rando1n vartable X '"Ill full on
the Interval (-1, + 1), (4) find whether the numertcal cbaractertstiCS,
1 e mean value and varJance e.xtst for the random vartable X

Answer (1) a= } • (2) F (a:) - arctan z f + {-.


(3) P {-1 < X<
1} == 1/2, {4) the characteriStics mx and Varx do not ex 1st s1nce thi!
1ntegrals for them dtverge
5 9 A rand om v ar1ab le X has an exponential d1Str1bu tton "1tb parant
eter }1, 1 e f (x) == Jle JL~ for x > 0 (1) Construct the distributton
curve, (2) find the distribution function F (.x) and construct tts graph
l......... (3) find the probab111ty that the random variable X Will assume a value
smaller than the mean
Ch. 5. Continuous and Mixed Random Variables 121

0 for
Answer. (1) See Fig. 5.9a; (2) F(x)= _e-ll:c for (see'
1
Fig. 5.9b); 632
(3) mx = 1/11; P {X< 1/!1} = F (1ht) = 1 - e- 1
~ 0. ·
F(m}
f(x) 1 ------

0 0
(a) (b)
Fig. 5.9

5.10. A random variable X has a Laplace distributio n, i.e. I (x) =


ae->-lxl, where f.. is a positive parameter . (1) Find the factor a; (2) con-
struct graphs for the density and the distributio n function; (3) find mx:
and Varx•
Answer.
1
-e>-x for x~O,
2
(1) a= 'A/2, (2) F (x) = 1
1- 2 e-?..x for x> 0.
fhe graphs of the density and the distributio n function arc given in
Fig. 5.10a, b;
(3) mx = 0; Varx = 2/'),_2.
5.11. A random variable R, which is the distance from a bullet hole-
to the centre of the target, has a Rayleigh distributio n, i.e. 1 (r) =
Are-h'r' for r > 0 (Fig. 5.11).
f(x}
1 -----
a f(J:} f(r)

D 0
(a) (b) 0

Fig. 5.10 Fig. 5.11

Find (1) the factor A; (2) the mode QJ/t of the random variable R
i.e. tho abscissa of the maximum of its probabilit y density function~.
(3) mr and Varr; (4) the probabilit y that the distance from the bullet
hole to the centre of the target will be less than the mode.
~nswcr. (1) A= 2h2 ; (2) QJ/t = 1/(h lf2); (3) mr =aft JJ/2 = y
1fa/(2h); Varr=(4- a)/(4h2)=ak 2 (4-n)/2; (4) P{R<clt}~0.393-
122 Applled Problems in Probabllfty Theoru

5 12 A random "Variable X has a dens1ty / 1 (x) w1th probab1hty Pt or


a dens1ty f 1 (x} \Vlth pr0bab1.llty p 2 (p 1 + p, = 1) Wr1te expre.ssioM
for the density and d1strihut1on functton of the vartable X Ftnd its
mean value and var1ance
Solutaon Dy the total probabtl1ty formula w1th hypotheses 111 =
{the \artable X has a density / 1 (x)} and H 1 = {tho var1anee X has
a denstty f 1 (z)} '\ e have
F (z) = P {X< x} = p,F1 (.x) + p 2 F1 (x),
X !£:

"hue Fdx) = J fdx) d:& F 2 (x} = J/ (:r:} 2 dx

f (x)- F (x) = Ptft (x) + P2f'Z (z)


B} the I orm ula for the complete ex pecta t1on
oo roo

m:c-Pt \ xf.(x)dx+p 1 J:t/ (x)dx=p tn• +Pamz,,


2 1 1

'\vhere ml.. and m='• are the cxpect'lttons for tbo distrlbuttons 11 (z)
.and j 2 (x)
\\ e find the vartance from the second moment about the origtn
co

I x /a(z)dx-m~
OliJ

V'l.rx=P 1cttt+P'l.~1 -m~-p 1 ~ xtft(x)d.x+p1 2


-oo -co

"'\\here ct 11. and are the second momcn ts about the or1g1n for the
a.2.'l
.dtstrtbutlons / 1 (x) and I 2 (x)
5 13 Balls for the ball hearings are sorted by passtng them over t'vo
!koles If a ball does not pass through a hole of d1amote.r d1 but passes
through a hole of dtameter d 1 > d1, then 1t 1s accepted other'v1se 1t 1s
reJected The d1ameter D of a ball IS a normally distributed random
variable 'v1th charactertsttcs m(i = (d1 +
d 2)12 and ad = (d, - d1)J4
Ftnd the probabtltty p that the ball \Vrll be reJected
Solut.on The Interval (d1 " d 2 ) 1s symmetrtc about md Sett1ng l =
{ d~ - d 1)/2 and ustng formula (5 0 36) \Ve find the probab 1l1 t y that
the hall Will not be reJected
P{l D-m,d <(d,-d1)/2}=2C!> { 2~d 1 )
1
1

~hence

p = ~ _ 2Cl> r a-'~-:ai 1= 1-21ll ( :.rr~:i''l = 1- 2<D (2) = o 0455.


5 14 Under the conditions of the preced1n «"problem find the mean
square deviatlon ad of the. d1amete.r of the ball given th.al10 per cent of
all the balls are reJected ..
Ch. 5. Continuous and Mixed Random Variables 123

Solution. The probabilit y of a reject

p = 1- 2Q:J ( d~~dl ) = 0.1, Q:J ( d~~dl ) = 0.45.

Using the tables for the error function (Appendix 5), we find the argu-
ment for which the error function equals 0.45:
(d 2- - d1)/(2crd) ~ 1.65, <J'd ; (d 2 - d1)/3.3.

5.15. When a computer is operating, faults may occur at random


moments. The time T a computer operates until the first fault occurs
has an exponenti al distributio n with parameter v : cp (t) = ve-V 1(t > 0).
When a fault occurs, it is detected at once and the necessary correc-
tions are made within a time t 0 after which the computer begins to
operate again. Find the density f (t) and the distributio n function F (t)
of the time interval Z between successive troubles. Find the mean value
and variance. Find the probabilit y that Z is larger than 2t 0 •
Solution. Z = T t 0 ; +
ve-v(t-to) for t> t0, 1-e-v(t-to > t>t 0 ,
for
f (t) = F(t) =
0 for t<t0 , 0 for t < t 0 ,
M [ZJ = 1/v +t 0, Var [Z] = 1/vz, P {Z > 2t0 } = 1-F (2t0) = e-vto.
5.16. The time T between two computer malfunctio ns has an expo-
nential distributio n with parameter 'A, i.e. f (t) = 'Ae-i..t for t > 0.
A certain problem requires a failure-fre e computer run for a time 1:.
If a malfunctio n occurs during the time 1:, the solution of the problem
must be restarted. A malfunctio n is detected only in a time 1: after
tbe solution has begun. We consider a random variable e which is the
time during which the problem will be solved. Find its distributio n
and expectatio n (the average time needed to solve the problem).
Answer. The random variable e
is discrete and has an ordered series

8:
.. I 2"' I ... I iT I ...
P I pq I ··· I pqi-l I ··· '

where P=e-~"', q=1-p=1-e-~-c; M[8}='t/p ='te-t.-c .


. ~.17. Under the conditions of the preceding problem, find the proba-
bthty that no less than m problems (m < k) will be solved durittg time
t =h.
Solution. vVe designate as Pm, k the probabilit y that exactly m
pr?~lems will be solved during the time t = k1:. Pm. k is the prob-
a?Ihty that exactly m of the k time intervals 1: will be without malfunc-
t~ons .. The probabilit y that there will be no malfunctio ns during the
time mterval 'tis p = P (T >Jc) = e-~-r. According to the theorem on
lhe repetition of trials
1.24 Applied Problem$ in Probllbi llty Theory

The probab tltty that no less than m problem s wtll be solved lS

h If.

Rm tt = lJ P, ~ = }J Cio-t"-.: (1- e-~ 1 ) 11 1


-
1 .m t~m

-or more con ven ten tly,


m l
Rm k = t- Lj c1e-i1t (t - e-Af)lt- i
i-0

5 tS Prove th~t the dt~tr1but1on of


the tnterva ls betwee n the sucees
s1ve e' ents 10 a stmple Oo\v w1tb Jntenst ty A 1s expone ntial with par
ameter A.
Solut•on \Ve first ftnd the dtsltlb utton functto n F ~l) of th~ random
var1abl e T 'vhtch IS the dtstanc e betwee n the success1ve events
F (t) = P {T < t} = P {at least one event 10 the elementary Dow
occurs during the t1me t} == 1 - e ~~ for t > 0 hence f (t) = F (t)=
Ae i. t for t > 0
5 f9 Gt, en a Pot~son field of potnts on a plane w1th a constant
dens1 ty ;., find the d1str1but1on and the numer1 cal charac:terlStlcs m,
Var of the distanc e R from any po1nt tn the
field to 1ts nearest netghour
• Solutto n \Ve find the dlstrlbU tton funct1on
• F (r) of tbe variabl e R by draWin g a ctrcle
of ra d1us r about a po1n t 1n the fi.el d (F1g 5 f ~)
• • • For the dxstanr c R from the point to 1ts nearest
• • • • netghb our to he .smalle r than r at least one othe!
• • point must fall \Vlthtn the ctrcle The proper
ties of the Po1sson fields are such that the
F•g 5 19 probab1l1t y o~ thts e. . . ant lS 1ndepende.nt of
whethe r there IS a po1nt tn the centre of the field or not Therefore
F (r)- 1-e... nr•). (r > 0},
whence
f (r)- 2JtA.re nl,-s (r > 0)
Thts distrib ution lS known as a Rayle1 gh d1~trthut1on (see Problem 5 11
0)

mr = 5r2tti.re-~r' dr- 2YA 1


0
OD

[R] = Jr'2n,.re-nb• dr =
1
a2 nA ,
{J

Var, =a, [Rl- m:


=-1/(n~) -1/(4A )- ~4- n)/(4nA )
5.20~ Po1nts are located at random 1n a three d1mens Jonal space
The numbe r of potnts 1n a certa1n volume v of the space ts a random
var1a.ble wh1ch has a Po1sso n dtstx1b ut1on wtth expectat1on a = ~u,
Ch. 5. Continuous and llfixed Random Variables 125

where I. is an average number of points per unit volume. We have to


find the distributio n of the distance R from any point in the space to (
the nearest random point.
Solution. The distributio n function F (r) is defined as the probabilit y
that at least one point falls in a sphere of radius r:
F (r) = P {R < r}= 1-e-i.v(r) ,
4
where v(r)=g~ra is the volume of the sphere of raditt:; r. Hence
-~. ~ nr•
f(r) = 4nr 2 A.e 3 (r > 0).
5.21. The stars in a certain cluster form a three-dime nsional Poisson
field of points with density I. (the average number of stars per unit
volume). An arbitrary star is fixed and the nearest neighbour , the next
(second) nearest neighbour , the third nearest neighbour , and so on, are
considered. Find the distributio n of the distance Rn from the fixed star
to its nth nearest neighbour .
Answer. The distributio n function Fn (r) has the form
n-1

Fn (r) = 1- ~ ~~ e-a,
4
where a= 3 ~r 3 /. (r > 0),
h=O
••

the density function is

f 11 (r) = dF n (r)
dr
= an-1
(n-1)!
e-a4~1.r2 (
r
> 0)

5.22. Trees in a forest grow at random places which form a Poisson


field with density A. (the average number of trees per unit area). An
arbitrary point 0 is chosen in the forest and the following rauuom
variables are considered:
R1 , the distance from 0 to the nearest tree;
R 2 , the distance from 0 to the next (second nearest) tree;
• • • • • • • • • • • • • • • • • • • • • •
Rn, the distance from 0 to the nth nearest tree.
Find the distributio n for each variable .
. Solution. We found the distributio n function of the random variable
m Problem 5.19 to be
Fdr) = 1- e-n:r•A. (r > 0).
Th_e. distributio n function F 2 (r) = P {R 2 < r} is equal to the prob-
abihty that no less than two trees fall in a circle of radius r:
F2 (r) = 1- e-nr•A._ ~r2A.e-nr•A. (r > 0).
Reasoning by analogy, we obtain
n-1
(r > 0),
126 Applle d Prablem.1 In Proba bllllV !'htor y

To get the denst ty ln. (r), we d•fter cnttat e Fr~ (r) w1th respe ct tor, 1 !.,

n-1 n-l
( ) - dF n (r) _tkJ - ( - ~ k all-1 c-a ....1. <\1 ,a~ a ...a) 2ttlr
fn r - da dr - LJ Jcl £J kl
h~o k~o

an-1
== \n-i} \ o-<;2n:Ar (r > 0),

5.23 An autom attc traffic s1gnal at a cross ing 1s set so that a green
ltght turns on for a m 1n u tl' and then a red l1gh t turns on for 0 5 JD.tnt
then a green ltght turns on aga1n for a mJnute and a red ltght for hall
a minu te, and so on A certa1 n Petro v arrtve s at the etosst ng 1n a tal
at a random mom ent unrel ated to the opera tton of the traffic signal
(1) Find the proba htlaty that he \viii pass throu gh the cross ing" ttbout
f(t)
1
f(t)
Z/J 2/J
1/J

0 t
0 05
(!J)
f1g 5 23

stopping,. (2) ftnd the dtStr tbutto n and the nume r1cal characteriSttcs of
the wa1 t1ng time at the cross ing T w, (3) const ruct the d1str1bu tton
f unc tton F (t) of tne 'va1 ttng ttme Tw
Solut ion The mom ent the car arrtve s at the cross ing Is untformly
d1str1huted tn the Interv al 'vhicb 1s equal to the traffic stgna l C}tle,
1 e to 1 + 0 5 = 1 5 mtn (Ftg 5 23a)
For the car to pass throu gh the cross ing wttho ut stopprng~ 1t must
arrive durin g the Interv al (0 1} The proha h1l1t y that a rando m vari-
able un1fc~roly d1str1buted 1n the 1nter val (0, 1 5), w1ll !all 1n the
Interv al (0, 1) IS 2/3 The wtutt ng t1me Tw 1s a miXed rando m variab le
It Is equal to zero wtth prohabtl1 ty 2/3 or may assum e any value he tv. een
0 and 0 5 m1n w1th un1fo~m denst ty w1tb proba b1l1ty 1/3 The graph of
the d1str1butron funct ton F (t) 1S show n 1n Flg 5 23b
The avera ge wa1t1ng time at the cross1ng
l\f (Tw1 = 0 2/3 + 0.25 1/3 ~ 0 083 m1n
The varia nce of the wa1tt ng t1me
0 5-
Var[T w]=c taiTw ]-(t\l iTw) )2-=0 2 _?_+ l \ t" 1 dt
3 3 J 0 5
0
- (0 083) 2 ;::: 0 0208 nun2 , a1'N ~ 0 144 m1n
Ch. 5. Continuous and Mixed Random Variables 12T

5.24. The normal distribution function. A random variable X has a nor-


mal distributio n with parameter s m and a. Find its distributio n func-
tion F (x).
Solution.
F (x)=P{X <x}=P{ -oo <X <x}
=dJ (x:m) -dJ(-oo )=dJ (x-;m) +0.5.
5.25*. Show that a function of the form
/
8
(x) = ax•e-a•x• for X> 0
[a > 0 and a> 0 are constants and s is a natural number (s = 1 t-
2, 3, ... )] possesses the properties of a probabilit y density function.
Find the parameter s a and a given a mean value mx, and find Varx.
Solution. The parameter s a and a can be found from the conditions
00 00

Jax•e-<ax)• dx = 1, JaxS+ e-(ax)• dx = mx.


1

0 0

Using the substitutio n (ax) 2 = t, we reduce the integrals to the gamma:


function
oc
rJ x•e-(etx)• dx-- 1
2cx.S+l
0

00

where r(m)= \ e-ttm-1dt (m>O) with r(m+1)= mr(m.); for the



. 0
n = 1, 2, 0.. we obtain r (n+ 1) = n!, r (n + 1/2) =
Integral
(2n-1)ll
2n
v-
n, (2n-1)!1 =1, 3, 5, ... , (2n-1).
From the given conditions we find that

ar ( sti )1(2aS+1> = 1, ar ( st 2
1
) (2a•+2) = mx,
whence

The second moment about the origin


128 Apphed Problem! ltt Probabtlity Theory

'\Vltcnce
(S + f) fl t( B

:""-:- --- -
f )
1
------
2f" { •;l ) t

Some of the d 1s tr tbu tto ns of the form f, (x) have defin1 te names
e g / 1 (z) Is lt.no,vn as a Rayleagb lhslrJh u linn. and f 2 (x), as a 'faxnell
dtstr!b uhon I or a Raylei gh distrib ution (s = 1) \\ e ha\ e / 1 (x) =
ax e-a:axt (x > 0)

a.= ~.x l:"ir. , a= 2a.z= .!=~ Var,.,= m~ [! -1 ].


For a ]tlax" ell d1Slrtbut1on (s = 2) \ve have f 'J (x) = a..t 2e-a';.:•
2 4~J.. 3 32 ~ ( 3n 1)
a.= r-'
m~ 1 n
a=v-==
:t
2 ~·
.n mx
Var~=m% 8 -

ne~nark All the dJstrlbUtlODS of the form


-a. .ax•
1. (z) = o.z'e (.r > 0)
for ~ b !ven s have a ~nngle paramet er 1 e they depend only on one pd.rameter. whtch
rna' bo e1tber the mean value or the variance ..
5 2fJ*. \[amen ts of a normal dt:;tr~butlon G1ven a random var1able X
\Vh1ch has a normal dJ.StribUtlon With parameters m and a, fi.nd the ex pre~
s 1on for the q uant1 ty ex, [X], wh 1ch ts the sth momen t about the origln
Solutio n Let us express the momen ts a 1 [XI .:=:= ?\f [X'] about tha
or 1g 1n 1n terms ol the central momen ts f.'-, I X} = 1\.1 l (X - m)'}
I

as=~~ [(X- m + m)'] = ll-O


~ c:Jlk [XJ m•-h ~-to [X]= 1.

For central momen ts WI til odd s = 2n + 1 t '' e have


.. )oo - _1 (.x .... m )* oo J '
fl, IXJ = -'r~ (x- m)• e 2 u• dx =.. 1-....!.....- \ y~c - 2oj II dy ===== 0
a l' 2n a y 211: J
-oo -~

and With even s = 2n, by the formul as 1n the preced1 ng problem.,


l\e have
oo 1 oo ( y )I
Jl, [X]= :- I y•e- 2o• :u• dy= 2_ \ y•e- o V2 dy
a 2n J
-~
a Y2tt J
0
r 2n.+t
a 2n ~

For exampl e,
1-12 [XJ = az,. a2 [~Y.J = m2+a2 ,
a 3 [XJ = m3 +3a:m ,
Ch. 5. Contin uous and ll!ixed Rando m Variables 129

ft~ [X]= 3cr4 , a~ [X]= m4 + 6cr2 m2 + 3cr~,


a 5 [X]= m5 + 10cr2 m3 + 5. 3cr!.m,
fta [X]= 15cr 6
, +
a 6 [X]= m6 15cr2 m4 + 15. 3cr4m 2 + 15cr 6

5.27. A rando m varia ble X has a norm al distri butio n with param eters
m and cr. Write an expre ssion for its distri butio n funct ion F (x) =
P [X< x}. Write an expre ssion for the distri butio n funct ion
lJf (y) = P {Y < y} of the rando m vari- ·
able Y =-X .
Solut ion. In accor dance with the solu-
tion of Probl em 5.24
F (x) = <D ( x:m) + 0.5,
\fl(y) =P{Y <Y) =P{ -X <Y}
=P (X> -Y)= 1-F (-y) 0
--
= <D (y~m) + 0.5. '
Fig: 5.28

5.28* . The distri butio n funct ion F (x) of a nonn egati ve rando m
variable X is given by a graph (Fig. 5.28). The mean value of the ran-
dom varia ble X is mx. Show that mx can be repre sente d geom etrica lly
as the area of the hatch ed figure in Fig. 5.28 (boun ded by the curve
y = F (x), the straig ht line y = 1 and the axis of ordin ates).
Solut ion. We have
00 co co

lnx= Jxf (x) dx= ~ xF' (x) dx= - Jx [1-F (x)]' dx. 0
0 0

Integ rating by parts , we get


co co

mx= - x ['i-F (x)] I+~ [1-F (x)l dx.


0 0

Let us prove that the first term is zero:


co

x[1- F(x) ] I =lim x[1- F(x) ]=0.


0 X-+o:>

Indeed, for ihe nonne gativ e rando m varia ble X, which has a finite
mean value , it follow s from the conve rgenc e of the inteO" ral
~
b
co

) xf (x) dx that ~ xf (x) dx-+ 0 as [(-+ oo and, since


0 l:.
co co

J( ) f (x) dx~ ~ xf (x) dx


K J{ .
f 30 A pplled Pro bltmJ ln PrO bablhty T h~oru

tt foiJo,vs that A [ 1--- F (/()]-,. 0 ns A • ,.. oo Con cequen tl y • 11m .tIt-


:x .... oo
F (z)] = 0 11e-nco
OCI

m~ = ~ {1 - F (x)l d.r,
0
and thts 1s precise!} the area hatched 1n F1g 5 28
5.29 A random vartable X has a normal dtstrihution \VIth mean value
m = 0 (F1g 5 29) Gtven an 1nterval
f(:r} (a., ~) wh1ch does not tnclude the
or1g1n. for \Vhat value of the mean square
deviation a does the probabll1ty that
the random \ ar1ah\e X '\V1\l ia\\ 1n \n~
1nter' al (a, ~) atta1n a ma'timum?
Solution \Ve find the 'alue of a by
dtfferent1at1ng the probability of falhng
1n the 1nter' al (a, p) w1th respect to
n c(. p a and equat1ng the der1vattve. to zero
F1g 5 29 \Vo have

~,

= ae-
(ll

~e 2o 2ol

and concequently,
.... / ~~-ct 1 .. / ~~a. ~-a
0
=V 2 (lll ~-Ina)= V 2 "In ~-=In a •
For a .small Interval (a - e,. a + e)
a ~a [1 - (e/a) 2/61 ~ a
For Instance, for e/a < 0 24 the formula a ~a ytelds an error, sma11er
than one pt1r cent
5 30 A Taudom var1able X has a normal d1str1butron w1th expecta
tton m and mean square dev1atton a 'Ve must approl.Imate the normal
distrihutJon by a uniform dtstr1hutron 1n the Interval (a, fl) "tth the
boundar1es ex and fl chosen such that the mean value and variance of X
are ret1.1ned constant
~o"lu)lon ·~Por a uni1orm Olslributton on fbe 1nterva1 (ct "'Vl
~~ [.\ 1 = (a + ~)/2, a [XI = (~ - a)1(2V 3},
(a + ~)/2 = m~ (~ - a)/2 ya) = a
C:h. 5. C:onttnuous ana illlxea lf.anaom v arzaules "''J.

Solving these equations for a and ~. we have

a = m- a V3" ~ = m + ay3.
5.31. A continuous random variable X has a probability density
function f (x). As a result of an experiment it is found that an event

f(:c)

0 '--v--' X
0 aJ ..1 lo ..1

Fig. 5.31 Fig. 5.32

A = {X E (a, ~)} [the random variable X falls on the interval (a, ~)1
occurred. Find the conditional density I A (x) of the random variable X
provided that the event A occurs.
13
Solution. P (A) = ~ f (x) dx. By Bayes's formula (5.0.24) we have

13
f(x)/{Jf(x)dx} for xE(a,~),
fA (x) = (5.31)
0 for x ~(a, ~).

The distribution curves f (x) and fA (x) are shown in Fig. 5.31. The
ordinate of the curve fA (x) at each point is equal to that of the curve
f (x) divided by the area S hatched in Fig. 5.31. Since S < 1, it fol-
lows that fA (x) >I (x) for any x E (a, ~).
5.32. A factory manufactures homogeneous articles whose rated size
is 10 but for which random size deviations are actually observed that
have a normal distribution with mean value m = 0 and mean square
deviation a. The sorters reject every article whose size differs from the
rated size by more than a tolerance 1.1. Find the probability of the event
A = {an article will be rejected}. Find and construct the probability
density function for the size of the article which passes the sorting.
Solution.
P (A) = P {I X I> 1.1} = 1 - 2<D (Ma), P
-
(A) = 2t:l) (1.1/a).
By formula (5.31) we have
x•
j f (x) 1 -
.A (x) = p (A) = 2crl/2n «1> {6./cr) e 2u• for IX I< 11.
t32 Appl ed PtobtemJ ln ProbtJ.btltly Theory

The s1ze L of an arttcle wb1ch passed the sort1ng is equal to the


error X plus the rated size l 0 and has the cond1t1onal density
(.:t-leS 1
(f'- ( x) - t o- 2o•
A - 2cr l/ 2n tD (6./a)

shown 1n Ftg 5 32
5 33 .. Tbe xeliab1l1ty p of a dev1ce depends both on the total elapsed
t1me 'f from the moment th~ de-v1co 'vas turned on and on whether \ht.
voltage regulator faded at some moment t < "C. If the voltage regulator
does not fatl unt1l the moment "t', the reliab1l1ty 1.9 defined by the lunc
tton p =: p0 (-t), 1f 1t ia1ls at \he moment t < ,;, then tho rel1ah1lJ.\y 'lS
a function of two arguments p = p 1 (-r, t) The ttme of the failure free
performance of the voltage regulator ts a random variable T with den
Stty 1 (t) Flnd the dlstttbutton functton Fe(:t) of the tune e of \he
failure-free performance of the dev1cc and 1ts expectation me
Solution The complete rel1abtl1ty of the dev1ce (the probabl11ty of
its fa1lure free operation for the t1ma tt) can be found from th-e total
probab1ltty Integral formula
~ ~

P ('t)- JP (t, 't) /{t) dt +Po ('t) J/(t) dt


0 ~

The func.tton of time dJStrihUtlOD a


Fe (x) = P {0 < z) = 1- p (x).

Stnca the quanttty e lS nonnegativeJ It follows (see Problem s 28)


that
co <10

me=M j6)= J{1-Fe(.:t)) dx= ~ p{'t)d't'.


0 0

5 34. A message S 1s expected over a communicatton channel The


moment T the message IS rece1ved IS acc1dental and has density fune-
tton I (t) At a moment T 1t IS found that the message has not yet ar-
rived. Under this condttton find the dtstrtbution density {J) (t) of the
time e wh1ch remains until the arrival of the message S.
Solution. Aecord1ng to Ba}es'sformulafA (t)~ wh1ch 1s the cond1t1o..
nal dens1tyofthe quantity T prov1ded that an event A = {the message
has not arr1ved by the moment "t} occurred., 1s
· ,·ttr ~ rt~f
p (AJ - " f
fJt.lt)=
1.- r t (t) d t
5 or t > 1''
0 for t <-r.
Ch. 5. Continuous and .Mixed Random Variables 133

Since e= T - ~, it follows that


I (t)
't'
s
1- f (t) dt
for t > 0,
(j) (t) = 0

0 for t < 0.
Fig. 5.34. shows f (t) and cp (t); for the distribution cp (t) the origin
coincides with the point ~. . .
5.35. The moment T an event A occurs is a random vanable w1th an
exponential distribution, i.e. f (t) = A.e-:l.t (t > 0). It became known

r(tJ _..___ .

0 T t 0 t·
Fig. 5.34. Fig. 5.35

at a moment ~ that the event A had not occurred. Find the conditional
distribution density cp (t) of the time 8 remaining till the occurrence of
the event. '
Solution. The event B = {the event A has not occurred by the mo-
ment ,;} occurred. Thus
't'

P (B)= 1- JI (t) dt =e-A-r:.


0
f B (t} = f (t)/P (B)= J..e-:l.t /e-A-r:= J..e-'- <t--r:) (t > 't).
The random variable 8 = T - ,; has the conditional density
cp (t) = A.e-'-'t for t > 0,
i.e. an exponential distribution which coincides with I (t) (see Fig. 5.35).
Thus, the conditional distribution of the time 8 which remains till
the occurrence of the event does not depend, for an exponential distribu-
tion of T, on the elapsed time. An exponential distribution is the only
distribution which possesses this property. Recall that the time inter-
val between two successive events in an elementary flow has exactly an
exponential distribution: the time remaining till the occurrence of the
next event does not depend on how long we have been waiting for it
(this follows from the absence of aftereffects in an elementary flow).
5.36. The probability of a failure of a radio valve at the moment
it is turned on depends on the voltage V in the circuit and is equal to
q (V). The voltage T' is random and has a normal distribution with
134 Applied Probltms in Probablhly Theory

parameter s v0 and u 0 F1nd the total probabtltt y q of a fat lure of the


,al\e a the moment It 1s turned on.
So1uttu-n. By the 1ntegrel total ptobah1lity formula (50 23) we ha'ie
(f'-'f:n}~
co co -

g= l q (v) j {v) dv = 2._ ( fJ (v)e 2a: dv.


J
-~
Jf 2"t Ot J
-~

5 37 ~ random voltage V., \Vhtcb has a probabtltt y dens1ty I (v)t


lS passed through a voltage ltmtter whrcb cuts off all voltages smaller
than v1 and larger than v2 , lfl the first case r.a1s 1ng the voltage to v1 and
1n the second case lo\\ er1ng It to V2
F1nd the dtstrLbUtlon of the random
-
f(y) f(Y) ......
F(v)
,.,
variable V, the voltago whlch bas
1 .......... --~.,....-.-.... ....... .,..._--~-~
/ passed thro\1gh the l1m.1ter~ and
,1-r"' determtne ltS mean value and
'ar1ance
Solution41
V1 for l' < V1...,

0 v
v= V for v1 < l' < v~,
Vz v2 for V>v-z
,..,
Ftg 5 37
The random variable V lS a m1xed
quan t1 t y and 1 ts two values t'1
and ti 2 ha' e nonzero probabtltt tes p 1 and p 2 For all the values between Vt
and v~ the dts trtbu lion func t1on F (v) of the v ar 1a ble V 1s con t tn uous and
00
"
J/(v) dv
'l'

F (v) = \ f (v) dv,


...
-oo
Pt = JI (v) dv,
-I)D
p2 =
t,
The graph of the funct1on F (v) 1s sho\vn 1n Ftg 5 37.
tl

l\1 (VJ = VtPt + VtP2 + Jvf (v) a~,•,


't1i
Var IV)= a~ tVJ- (l\l{V])2,
t

a:, ll) = v1p, + v1p 2 + v''f (v) dv r


"•
5~38. A message ol length l 1s be\ng bToadcast over a rad1o channel
(F1g 5 38a) In order to eraze the message, a notse pulse tra1n of length
b > l1s InJected 1nto the channel so that the centre 0 1 of the tratn coin·
etdes \v1th the centre 0 of the message
Because of accidental errors, the centre of the pulse train proves
to be d1splaced by X relat1ve to the centre of the message The random
vartable X has a normal dtstrtbutJo n w1th parameter s m = 0 and o ==
l/2. F1nd the dtstr1butt an of the -random var labl~ U, the length of the
Ch. 5. Continuous and ll!ixed Random Variables 135

message which is eraze d, and the mean value mu, the varia nce V ar u, and
the mean squar e devia tion au·
Solut ion. The rando m varia ble U is mixe d; it assum es the value s 0
and l with nonze ro proba biliti es. On the inter val from 0 to l the dis-
tribut ion funct ionF( u) is conti nuou s. The noise train b does not touch
F(t)
1 ---- -·-, ::-- --

X
l }\\----;
frzuw 1'@<<f'?sssy ' J&W§S S:SSSS )' t
0
Po
l t
u h
{a) {b)
Fig. 5.38

the message l (U = 0) if its centr e is furth er than (b + l)/2 from the


origin.
Po= P{U =O} =P{ !Xl> btl} =1- P{1 Xl< btl}
= 1- 2cD ( btl ) = 1 - 2cD ( btl ) .
For the noise b to eraze the whole messa ge l (U = l), its centr e 0 1
must be less than (b - l)/2 dista nt from 0:
p 1 =P{ U=l }= P{IXI < b;l }=2c t>(b -;z).
For (b - l)/2 < X < (b +
l)/2 a part U of the messa ge l is eraze d
(0 < U < l). We find the distri butio n funct ion of the rando m varia ble
U: P {U < u} for 0 < u < l. For the eraze d part to be small er than
n, the centre 0 1 of the pulse train must be furth er than (b l)/2 - +
n = (b + l - 2u)l2 from the centr e of the messa ge:
F (u) = p { lXI > b+l;21t} =1- p { lXI < b+Z;-2u}
= 1 _ 2Q> ( b+ll-2u).
Thus, on thein terva l from 0 to l, we have F(u) =1-2 cD ( b+l- 2u)
(Fig. 5.38b). l
To fmd the mean value and varia nce of the rando m varia ble U, we
must find F' (u) on the interv al (0, l). Takin g into accou nt that
b+l-2 u
l
<D (x) = ri__ <D j b+Z-2u ) = 1 --
I'

l 2:t \ l y21[ J
0
e 2 dt
!36 Applie d Probl tml In Proba blllfy :lhtor v

and dtfferenttatJng F (u) With respect to the var1able Ur 'vhtch appears


1n the upper l1 m1 t "e obta 1n

F' (u) du == ll'Z i e


I
M{U ]=O p0 +lp1 + JuF (u) du,
Q

-
Pp + Ju F (u)du , 2
a 2 fUJ=O Po+ 1
0

\ ar (U} = ct2 (U] -(I\1 [U]) 2 , a 11 = lfVa r tUJ


5 39 The random varta ble X may have o d1str1button density /l(.t)
wttb proba bility p 1 a densi ty f 2 (x) wtth proba bility p 2 , , a den

f(~)

DJ
f(X)

0 ~ 4 :r 2 f 0 I 2 34 i t
FJg. 5 40

s1ty I 1 (..t) (1 = 1 , n) w1th proba .b!ltty Pt• F1nd the total (aver
age) denst ty of the rando m '\arta b1e X
Sol uti on \\ e find the elem ent f (x) dx of the p.roh ab1li t y by the total
proba bility formula on the h,-potheses H 1 = {the dens1t~ of the ran
dom 'ar1a ble Is fl (x)} (t = 1 J n) By tho total proba bility for-
mula
ft.

I (x) dx = ~ Plft (%) dx


t~1

whence
n
I (x) = ~ Ptft (x)
~~

In part1 cular t 1£ there are two hypo these s and their proba htlJtJ es are
P1 = p,. = 1/2, tben I (x) 1s half the sum of the densi ties / 1 (x) and
12 (x) (F1g 5 39) Two hump~ on a distri butio n curve always 1mply
that the distri butio n has been obtai ned by avera gtng two distr1 button s
of dtfierent types
Ch. 5. Continuous and Mixed Random Variables 137

5.40. The random variable X may have a normal distribution with


parameters m = 0 and cr = 2 with probab~lity 0.4 or .a .normal di.stribu-
tion with parameters m = 2 and cr = 1 With probability 0.6. Fmd the-
probability density function of the random variable X.
Answer.
f( x ) -- 0.4 -g-+
x•
0.6
V2n
(x-2)'
2
2 y2n e •
e
The graph off (x) is shown in Fig. 5.40.
5.41. Two independent random variables, a discrete variable X
with an ordered series
X. x1 \ x2j ... j Xn
I I
. P1 Pzj .. • Pn
and a continuous variable Y with density f (y) are summed up, i.e ..
Z= X +Y. Is the resultant random variable discrete, continuous or
mixed? Find its distribution. ,...,
Solution. The random variable Z is continuous. Its density f (z)·
can be found by the total probability formula on the hypotheses H 1 =
{X = x1}, H 2 = {X = x 2 }, • • • , Hn = {X = Xn}· For the ith..
hypothesis, the conditional element of the probability of the random
variable Z is
/ 1 (z) dz = f (z - x 1) dz •. -- ~- ·~

The total element of probability


n
f (z) dz = i=i
LJ Pd (z-xt) dz,
whence

5.42. Given a variable Z = min {X, a}, where X is a continuous-


random variable with density f (x) and a is a nonrandom variable, is-
the given variable discrete, continuous or mixed? Find the distribution.
of the random variable Z, its mean value and variance.
Solution. The random variable Z is defined by the formula
., X if X< a,
Z= a, iE ..

X ~a.
The value of the random variable Z = a has a probability
00

Pa=P {X;> a}= P {X> a}= j f (x) dx.
a
. For Po
tmuous.
> 0 the random variable Z is mixed and for p = 0 it is con-
a
138 A pphed Probleml tn Probabtlttu :I'h!orv

For Z < """ (z) of the random var1able Z


a the d1stribUt1on function P
:r
(;OlflCtdes w1th F (x) = ....ff (x) dx (Ftg. 5 42)
-oQ

r..I[Z) = apa + ~ xf (x) dx,



-oo

-oo

If P {X ~ a} = 0, \hen Z = m ln {X, a} = X, F"""' (z) = F {z).


5 439 PeriodiC s1gnals of the same duration l and w1th the same In·
tervals L between them are sent from t"o sources A and B (F1g. 5 43a),
l < L The moments the signals are sent
from the sources A and B are not ttmed
~

f(z} \\'hen the messages, one from each sou.rcet


1
J PtJ ..,..-~ overlap, they are chstorted F1nd (1) the
Ff:r) 1 _,~ f(:rJ p?obab1ltt y ll that at least one message '\Vlll
f
he dtstorted (completely or partially), (2}
f(x) ~ the probabtltty that no more than 10 per
cent of each message \Vtll be dtstortedt (3)
0 a the d 1S tr1 button function of the random
variable Z wbJcb ts the "length of the dis-
Fig 5.42
dtstorted text z -
torted text", (4) the average length of the

Solution. Assume that the B ax1s (soe F1g 5 43a) overlaps the A-axJ.S
.at random S1nce the beglnn1ngs. and the ends of the messages are lU
funct1o nal rela t 1ons, 1t 1s sufflc1en t to consider only one pa1r of adJacen
-segments on the A axts .. 1 e l .. the message, and L~ the 1nter' al (F.tg 5 43b)

A -1- .....11:!:z:!!=:z:;r;z:!::~!:ew~~£~..----z:~t;; t::ZZ:12ll___;:,L-....tt:::t'.IJ:z::::lf rr:a..ra_ _


1:Z2 f:=:i? t:zz;t
f(t}
flr-v-.,...,.....1'"'1,...._~.___.--

B ..:. ~...---oct=:):::';;:::::rr---.::....{--,c;:;:::,'l:i~l:='=:Jjr--~£--ex:::S:5~;\'=:.~ t :r-)

{a)
Po
f %
(C)

Fig 5 43

1-Ve take the begtnntng of the 1nterval l on the A -ax 1s as the reference
-potnt and denute the abscl.Ssa of the beg1nn1ng of the nearest Interval t
.on the B-axts by X. The variable X ts dtstr1buted With constant denstty
.QD the tnterval L +
l. The messages \Vtll not evtdently overlap when
Ch. 5. Contin uous and Mixed Rando m Variab les 139

~. l< X <
L; the proba bility of this event is
Po= (L- l)/(L + l), R = 1 -Po = 1 - (L- l)I(L l). +
To calcu late the proba bility p that the messages will overl ap by no
more than 10 per cent, we must increa se the interv al L - l, which is
favourable to nono verla pping , by two interv als of lengt h 0.1l. Henc e
L- l + 2 x 0.1l = L - 0.8l; p = (L - 0.8l)I (L + l). The rando m
variable Z, the fracti on of the disto rted messages, is a mixe d quan tity.
For z = 0 it has a nonzero proba bility Po = (L - l)I(L l); for 0 < +
z < 1 we have F (.z) = [L - l (1 - 2z)]/(L + l); for z = 1 this
expression becomes unity : F (1) = 1, and betwe en z = 0 and z = 1
it increases linea rly (Fig. 5.43c). .
The mean value of the rando m varia ble Z is equal to the area hatch ed
z
: in Fig. 5.43c, i.e. = [1 - (L - l)I(L +
Z)J/2 = li(L Z). +
5.44. There is a conti nuou s rando m varia ble X with densi ty I (x)
(Fig. 5.44a). The observed value of the rando m varia ble is retain ed if

f(x)

,X

(h)
Fig. 5.44

it falls on the interv al (x1 , x 2 ) and is rejec ted if it falls outsi de of the
interval (x1 , x 2 ). A new rando m varia ble X (a "shor tened " rando m vari-
able) result s with a range of value s from x 1 to x 2 • Find the proba bility
~ ~

density funct ion f (x) of the rando m varia ble X.


"'
Solution. The sough t-for densi ty I (x) is the cond itiona l densi ty of
the variable X provi ded that it falls on the interv al (x1 , x 2 ). Let us cal-
culate the proba bility elem ent I (x) dx for the inter val (x, x +
dx) c
(xl, x 2) (see Fig. 5.44a ). By the multi plica tion rule of proba biliti es
f (x) dx = P {X E (x1 , x 2 )} t(x) dx,
-
where f (x) dx, the cond itiona l proba bility eleme nt, is the proba bility
t?at the varia ble X falls on the interv al (x, x + dx) unde r the condi-
tion that X E (x1 , x 2 ), and
•lx• x1

P {X E (x 1 , :r2)} = ~ I (x) dx, f (x) =I (x) j ~ f (x) dx.


Xt Xt
i40 AppUtd Prtiblem1 In Proha.lHllty Theory

,.
The curve off (x) 1s s1 m1Iar to that of I (:r) and can be found from it by
d1v1d1ng encb ordtnnte by the 8rea hatched in F1g 5 44b (~ee the tb1~
,;la,jl

cur' e in F1g 5 44b) Outstde of the Interval (x1, x 2) f (x) = 0


5.45 A 'voman states 'l\Iy husband JS of medium be1ght, but most
men are belo" med1um hetght '' Is the statement mean1ngless?
Solution The statement is not meaningless and can e\ en ba trueo
If the distribtitton density of the he1gbts of X men IS asymmetriC about
the mean value (expectatJon) as sho,vnt for example, 10 rig 5 45 The
area hatched 1n F1g 5 45 IS equal to the average fraet1on of men below
the med1um height mx, and 1t 1~
larger than the area left unhatched
f(z) 5 46 In the theory deal1ng w1tb
the ral~:lbJlity of devtcesl \VeibuJI's
dtsl.ribulion, wh1ch has a d!str1bu
t1on function
F (z) ~ 1-e-a.z-n (x > 0), (5 46)
0 1s a con!tant and n lS a
'vbere o; >
pos1ttve tnteger, 1s often used as the
dtstrtbutlon of the t1me of fad ure free performa nee of the dev1ce F1nd (1
the probabilttl dens1ty funct1on f (x) (2) the mean value and varJancE
of the randoro variable X 'vh1ch has a \Vethull dtstr1bUt1on
Solution
(1) I{%)= dF (x)/dx ~ nazn-•e-cr.xn.
CIQ

(2) M (XJ = Jxncx.x"' 1e•«.ll:n dx


0

)Ve make a change of the variable thus ax" = y, z = ct 1/ny f/r~ 1M

1 .1-n
dx=7a.-1Jn.y n dy.
(Iii:)
• f J 1-n
l\1 [XI= ~ nye-u ti a. -n !I n dy
0

=a-n ry'R e-Ydy=r(t++)a;-nt


\ 1)0 t 1

0
~

wher~ r {t)- ~ xt-le- dx 13 the gamma function.


0

!\1 I X 11= ) na:x'n +1e- o:.,:'"' aX


0
Ch. 5. Continuous and Jl!ixed Random Variables 141.

ooJ -( 1+....!..) 1+....!_ 1 _....!_ . 1-n


= 1UX(J., n y n e-Y- a n y n dy
n
0 .
2 00
2 2
= j yn e-vay=r ( 1+~) a--n;
a--n
0
Var [X]= a-ztn [r (1 +2/n) -{r (1 + 1/n)} 2].

5.47. The service life of a device is a random variable T with den-


sity f (t) (t > 0). At a moment t 0 , provided that the device has not
failed, a preventive maintenance is performed, and then it operates for
a time T1 with density ! 1 (t). If the device fails at a moment u < t 0 ,
an emergency repair is immediately carried out, and then the device
<>perates for a random time T 2 with density f 2 (t) (no more repairs are
made). Find the mean value E> of the time for which the device will
<lperate, excluding the time needed for the repairs).
Solution. Let T assume a value t < t 0 • On this hypothesis, the
(lOnditional expectation of the variable
00
is M [61 t] = t m 2 , where e +
m2 = Jtf 2 (t) dt is the mean value of the operating time of the device
0
after a repair. Now if t ~ t 0 , then the device will not fall till the mo-
00

ment f 0 , and M [E>I t] = t 0 + m 17 where m1 = Jt/ (t) dt.


1 Consequently,
0
for t < t0 ,
for t ~ t0•
The total (unconditional) expectation of the quantity
to oo
M[8]= J(t+m )f(t)dt+ J(t +m )f(t)dt
2 0 1
0 to
1o in
Jtf(t)dt+m Jf(t)dt+~t0 +m 1 ) I f(t)dt
.(X)

= 2
0 0 t.

to
= Jtf (t) dt+ m P {T < t }+(t +m 2 0 0 1) P {T > t 0 }
0
to
= Jtf (t) dt +m F (t 2 0) +(to+ m 1)(1-F (t0),
0

where F~(t) is the distribution function of the random variable T:


t

F (t) = St (t) dt.


0
t42 Appl td P oblems In Probab l 111 Tht(Jrlf

5 -18 A sequence of mess1ges of the S'lme length l are transmitted


o-ver a r4\dlo channel (Flg 5 48a) Wlth randosn 1ntervals Tl T 2
bet\\ een them The Inter\ als T1 T 1 have the .same dtstrtbutwn
A no1so Interferes w1th a message from ttme to t1.rne (5 48b) The moments
~hen the noi«:e begtns and ends are not connected 'v1th the c::equence of
the messages The durattan D of each tnterfcrence per1od ts accidental
and has 1n exponent1al dtstr1button 'v1th parameter Jl the duration
of an Interval bet'\\ cen Interference periods ts also acctdental and bas
an exponenttal d1str1bUt1on "\\ ltb p3rameter ~ If the 1nterferenc-e. per1od

0 ~
a)

Lfl'VVf -.;:V v- Y t F -., v 1' y If ~t i'Y


~ aJ
fig 5 48

lasts 1s long 9S a m cssage or a part of 1 t then e1 ther the 'vhole message:


or Its corresponding p'lrt 1s dtstorted F1nd the ,a, erage length of the:
message d1storted by Interferences 1 e the rat1o of the average length
of the dtstortcd text to the a\ erage length of tho transmit ted text
Solution 1 /fl ts tho average length of the Interference 1/v IS the
average length of the tnterv'\l bet\\een them the a"\erage fractton of
the tIme on the t a".: 1s occu p1od by tho Interferences ts
f/Jl v
~~-----
1!~+ 1/V _._ !-' +V
It Is evident that the same a\ erage fraclton of the messages \V1ll be dts-
tortcd b-y the- 1nterference lrrespcct1.ve of the d1s"irtbut1ons of their du
ra t1 on 'l nd the length of the Inter
'&' t
A_ _....., vals bet" een them tf ~ - 1/1\I [DJ
.. L r :
- - t v = 1/!ll (T}
) L

0 t 7:' 5 49 Durtng a ttmo 1: (an ob.ser


V'ltion perlod) a Signal arr1 ves "tth
Ttg 5 49 probability p The Signal appears at
any po tnt of the, 1nterval "C ~ 1th tho-
same probah1ltty dens1ly lt IS kno"n that at the moment t < T
{Fig 5 4U) the signal has not yet arr1 ved Ftnd the probability Q that
tt. \v-tlt a.rrtve dur1ng th~ r~ma.ln'i'r1g time 't - t
Solut1on Q IS Simply the conditlonal probabllity that the .signal
'"'Ill arr1 ve dur1ng the ttmo 't - t 1f 1t IS kno\\on that by the moment t
1t has not yet alJlved The total probahiltty p oi the arr1-val of the ~ng
nal during the ttme -r 1s equal to the probah)lity tpl-r: that It wtll arr1ve
d ur 1ng the t rme t pi us tho probabtlt ty 1 - t p/1: that J t wtll not atri ve
Ch. 5. Continuous and llfixecl Random T'ariables 143

during that time, multiplied by Q. Hence


p (1-t]'"c)
p=+P+(1-+p) Q; Q= 1-tp/1: .
5.50. The moment a signal arrives is a random variable T with den-
sity f (t). At a moment t < 't the signal has not yet arrived. Find the-
probability that it will arrive during the next time interval from t to
't (Fig. 5.50).
Solution. The problem is similar to the preceding one. We designate-
asp the probability that the signal will arrive during the time 't, = p
't

Jf (t) dt. Reasoning as in the preceding {(t)
0
problem, we find the total probability p
of the arrival of the signal during the
time 't. It is equal to the probability Pt
that the signal will arrive by the mo-
t
ment t, i.e. Pt = \ f (t) dt, plus the prob- Fig. 5.50
J
0
ability of the complementary event 1 - Pt multiplied by the condi-
tional probability Q that the signal will arrive during the remaining-
time ('t - t), i.e.
't t t
Jf (t) dt = Jf (t) dt + {1- Jf (t) dt} Q,
0 0 0
whence
't t 't
S J (t) dt- S I (t) dt S I (t) at
Q ....::0_ _--,-....::o_ _ _ =-'-I---::--;---,- _ F (1:)- F (t)
= t 1-.F(t) - 1-F(t) '
1- ~ t (t) dt
0

where F (t) is the distribution function of the random variable T.


If f (t) is an exponential distribution with parameter 'A, then Q =
:::: 1 - e- i.<•-t> .
. 5.~1. l!nder .the conditions of the preceding problem f (t) is a normai
d1s~nbuh~n With parameters m and a: t = m (i.e. the signal does not
arn.ve durmg a time m)*'· Find the probability that it will arrive during-
an mterval of length a starting at t = m.
Solution. -r = m+a,
mtu
J f (t) dt = <D ( m+~-m ) -<D (0) =<D (1) ~ 0.341,
m
Q ~ 0.341/0.5 = 0.682.
*l The problem is meaningful only if T is nonnegative, i.e. for m - 3cr > O~
CIIAPT.CR 6

Systems of Random Variables


(Random Vectors)

6 o ;,. system of two random variables (X, Y) can he geomotr1enlly Interpreted


a9 a tanaom potnt \\ ath eoordtnatcs {X Y) on,the :r v pJane (F1g 6 0 1) or as a ran•
dom veetnt d1rec\.t'd {tom th~ nt1gu1 to the po1nt (X 1 Y} whose compo.nents are tan.•
dom var1ables X and Y (F1g G 0 2)
A system of three random variables (X Y Z) 1.s rcprr-sented by a random point
.or a random trct()f ln a thrtt dlm~nslonal •pact, whJ1e a system of n random varza~les

9 y
- - .-4J (>,. Y) -~ (XY)
y y I
I I
a X 1J X $

rig 601 fig G02

(X1 X1 Xn> ts represented by a random point or a random v~ctor ln nn n d'men-


.ttanal spact
1'he jr)int pr~bability dulrtbuUon of two random vat1able-s (X, 'Y) (ot the pro~
.a.b1hty dlstr•button of a SJstem of two rand.om varJables) lS the probab1hty tha.t
both lnequal1he~ X < ~andY <yore Simultaneously satisfied. 1 e
F {.r, y) = P l X < z Y < y) lB l} 1\
F {z y) ean be Interpreted goometr1eally as the proba b1h ty of a random po1nt (X. Y)
falhng ina qua.dr3nt \Vhosa vertex ts (r y) the one. hatc.h~d in Ftg 6 0 3 The {lrOb

\ Ftg G0 4

l6.hlllty of a random po1nt {X Y} Iall1ng Jn a rectangle R whose s1des are parallel


[ttl the coordtnate axes and \Vhu~h includes 1 ts lower and left-hand boundaries and
<Ide~ not Jnclude the upper and tight hand boundaries (F1g 6 0 4} 13 expr~sed lD
\ te~ of a d1strl.b1Jt1.on (unttlO·n hy \he lo1m\lla
\- i. P IJ:X'._ Yl E_ R l = F f~ .. 6l, - 1! lrt~. t\\.- "€ IJ),y 'l, * 1! 'f' "). (~ 0 2)
_ The distributiOn function F (:r y) possesses the prOpertteS {1) P (-cot -(%)) ::::s
(- oo? y) = F (z, -oo} =
0., (2) F {+oo • +oc) == f, (3) F (z. +oo) ==
i. (~)~ F (+oo 1 y) == F1 (y) where !J (%) and F ~ (g) are the d1stnhntton Iunc·
;:s of the rand om var1a bles X and y, (4) F (.r ~ Ill f.!: a nondeerea!nn g fun ct 10n of
, •~l~ent!! z andy. (5} F (r'" g) IS eonttnuous on the left with respect to each co-
Ch. 6. Syst ems of Ran dom Vari able s 1.45

ordinate, (6) F (~, o) - F (ex., o) - F (~, I') +


F (a., l'f;;;;,: 0 for any a.~ ~. 'V ~
o (the last prop erty mea ns that the prob abil ity of fa ling in a rect angl e is non -
negative).
The joint prob abili ty dens ity of two cont inuo us rand om vari able s (or the dist ri-
bution dens ity of a system) is the limi t of the ratio of the prob abil ity of a rand om
point falling in an elem ent of a plan e !1x, !1y, adjo inin g the poin t (x, y), to the area
of the element whe n its dime nsio ns !1x, !1y tend to zero. The join t dens ity can be
expressed in term s of the join t prob abil ity dist ribu tion , i.e.
f (x, y) = fP·F (x, y)/EJx fly = F"xy (x, y) (6.0.3)

i.e. is the second mix ed part ial deri vati ve of the dist ribu tion func tion with resp ect
to the two argu men ts.
The surface repr esen ting the func tion f (x, y) is know n as the distr ibuti on surfa ce.
An element of prob abil ity for a syst em of two rand om vari able s is the qua ntity
f (x, y) dx dy which is an appr oxim ate expression for the prob abil ity of the rand om
point (X, Y) falli ng in an elem enta ry rect angl e with side s dx and dy adjo inin g the
point (x, y).
The prob abil ity of a rand om poin t (X, Y) falli ng in an arbi trar y dom ain D
is expressed by the form ula

P {(X, Y) ED} = )
(D)
Jf (x, y) dx dy. (6.0.4)

The properties of a join t prob abil ity dens ity:


co

(1.) f (x, Y) >0 and (2) 1JI


-co
(x, Y) dx~dy= 1..

The join t prob abil ity dist ribu tion func tion can be expressed in term s of the
joint density thus:
X y

F (x, Y) = J J. I (x, y) dx dy. (6.0.5)


-oo -oo

The prob abili ty dens ities of the sepa rate vari able s ente ring into a syst em can
be expressed in term s of the join t dens ity, i.e.
00 00

h (x)= ~ f (x, y)dy , f2(Y )= J f (x, y) dx. (6.0.6)


-co -oo

d" T!w c~nditional distr ibuti on of a rand om vari able ente ring into a syst em is has
its
tstnbut10n cnlculated unde r the cond ition that the othe r rand om vari able
a.o:sumed a definite valu e.
The conditional distr ibut ion func tion s of two rand om vari able s X and Yin a
sdyst~~ are designated as F 1 (xI y) and F 2 (y I x), and the cond ition al prob abil ity
ensih('s ~s ft (x 1 y) and / 2 (y 1 x).
There 1s a theorem on the mul tipli cati on of prob abil ity dens ities , i.e.
f (x, y) = It (x) f2 (y I x) or f (x, y) = / 2 (y) h (x I y). (6.0. 7)

The expressions for cond ition al dens ities in term s of abso lute dens ities are
ft(!! lx)= I (x,Y)
/I(x) for f 1 (x) =f= 0, f 1 (x Y) = f 12(x,(y)y) for /2 (y) =f= 0. (6.0.8)
'
t46 A ppll~d Ptable m1 In Pro babdltg TlttDrl/

The rando m va.rtable:S X, Yare sa1d to bo mutu allv !ndtp tnden t if the condtbOI1...
al probab1hty dJstr tbuh on of one does not depe nd on the value o1 the other vu,
I 1 ( z I y) == I 1 (z) or / 1 ( fl J z) = /2 (y} {6 0 9)

For Independent rando m 'arJa.b]es the multtphcethon theorem assumes ths


form
It (z. y) == It (r) fa (y) {6 0 10)
The moment of order k +
s abou~ tht origi n or the system (X, Y) I.S a quantity
ali, (X, Y) ~ ~~ [X ~~to Y•] {6 0 fH
Tbe etntr al mom tnt of ord~r k + ,. of system (X, Y) b a quantity
)11 ,lX, Yl = 1.1 \Xk ~~1
Form ulas ror cal cuI at lng mom en t.s
(a) for d1sa ete rand om van&bles
ak, (X. YJ = ~ ~ z:vjPlJ• (6 0 f3)
' )

f.lJu (X • Y) ~ 2] ~ (z,-m k)1t: {!IJ- m11)a Pi}!


\. J.

where PIJ == P {X = zt Y = UJ}t


(h) lor con t 1nuous rand om -v ar1abl es
co

a~~.. [X YI = ~
-oo
l rllg•l (:r, ll) d.r clrr, (6 0 f5)

.010

!J.u(X, YJ= JJ(z-m.~Jit(r-mv)'f(J:, v)d.t:dv, {6.0 1~


-f)Q

'\\here I (z g) Is a JOint probabilitY denszty


The tJrder of the mom ent o:h• (X., Y] or ~-'"-• [X Y) is the sum o! the indices k
and s
T.he cot-artance of t\\O random vatJables X. Y Is a second orde r .au:r:ed central
:room en t, 1 e p11
Cov.ry = J.I.Jl {X~ Y] = ~I {XY]
'C)

(6 a11)
It is conv enien t to calcu late the quan hty Cov-=:11 1n terms ol the .second m1xed mo-
ment abou t the or1g1n
Cov ~., == a 11 [X • 11 - rn:Em11 , (6 0 t8}
or, us1Dg aDother notat ton,
Covztt == ll IX Yl - lti [XI Ai (Y] (6 0 f9}
For 1noe:pendent random var1ables the cova rianc e 1s zero
The corrtlatton coe!ficfent (or the norm altze d covariance) r;ry of two rando m vana -
bl~ X and Y is a d1men.slonless quan hty
T (X .,Y) = r:xg == CoV:r:y/(a~ay)r (6 0 20}

wher e a,. ==- V V a r.:r === Y P.~u l X~ Y 1 and o~ I!;;;: yv ar" ::::;: Jr fl(l! (X, Y J

The correlation coefficJent is a measure of how close ly thi .taudom varia ble! are
linea rly relat ed ~
Ch. 6. Systems of Random Variables 147

Two random variables X and Y are said to be uncorrelated if their covariance or


the correlation coefficient, which is the same thing, is zero.
1 If two random variables are independent, then they are uncorrelated, but if they
are uncorrelated, they are not necessarily independent.
If the random variables X and Yare linearly related, i.e. Y = aX b, where
a and bare nonrandom, then their correlation coefficient rxy is ± 1, where the plus
+
sign or the minus sign corresponds to the sign of the coefficient a. For any two ran~
dom variables I r xy I ::;;;; 1. .
The joint distribution junction of n random variables X 1 , X 2 , ••• , Xn is the prob-
ability that n inequalities of the form Xi <xi are satisfied simultaneously, i.e,
F (Xt. x2,. .. , Xn) = p {Xl < Xt, x2 < x2, •.• , Xn < Xn}· (6.0.21)
The joint probability density of n random variables is the nth mixed partial de-
rivative of the distribution function
an
f (xi, x2, ••• , Xn) = F (.:~:I, x2, ••. , Xn)· (6.0.22)
0 x1 . 0.:r2, •• ., 0 X'n
The distribution function Ft(Xt) of one of the variables xi, entering into the system,
results from F (xi, x 2 , ••• , xn), if all the arguments, except for x 1, are equal to co +
F 1 (xi)= F (+oo, +oo, ... ,Xi, +oo, ... , +ooJ. (6.0.23)
The probability density of a variable x,, entering into the system (X1 , X 2 , ••• ,
Xn), can be expressed in terms of the joint probability density by the formula

(6.0 24)
-co -co

The probability density of a subsystem (X1 , X 2 , ••• ,XI!), entering into the system
(X1, X 2 , ... , Xk, Xk+t• •.• , Xn), is

ft .... ,k(Xto ... , Xk)=


rJ (n-k)
00
r 00

... J f(xi, ... ,xk, ... ,xn)dxk+l· .. dxn. (6.0.25)


-oo -oo

The conditional probability density of the subsyst~m X1 , ... , Xk, when all the
other random variables are fixed, is
f ( 1 ) f(xl, X2, • .. , Xn)
1 .... , k xh · • ., Xk Xn+I• • • ., Xn = fk (xk ¥ } • (6.0.26}
+1 • ···• n +to • • ·, ~n

If the random variables (X1 , X 2 , ... , Xn) are mutually independent, then
f (x1, X2, ... , Xn) = /1 (xi) /2 (x2), ... , In (xn)· (6.0.27)
T~e pro~ability of a random point (X.1 , X2, ... , Xn) falling in ann-dimensional
domam D IS expressed by an n-tuple mtegral, i.e.
n
P{(X 1, ... , Xn}ED}= JJ/(xl,
D
... , :l:n)dx1 ... dxn. (6.0.28)

. The correlation matrix of a system of n random variables (X X X )


tams the pairwise covariances of all the variables, i.e. 1' 2' ... , n con-
Cov11 Cov12 .. • <:::ovm
Cov21 Cov22 • • • Cov2 n
II CoviJ II=
• • • • • • • • • • • • • '
Covn 1 Covnn• " • "-v
'AI nn

t48 Applied Prablem1 In Probabtltty Th.eorv

0 0
where Cov11 === Cov::r::t:rJ ~ lf [X 1XJ1 Is the covariance of the random vanaMeJ
X 1 'rX1 c
ha conetat ion rna. t-ri x is s ymme. trie (Co v1J ~ ov nl aDd,. t beretore~ only
half the table Js usually filled
Cov11 Cov11 Cov1 n
Covl1 •~ Covin
~ ill "'" .. .. •

Covn"
The var1ances of the random variables X 1 X., • X n lie along the prtnc1pal
d1agonal of the correlat10n m.1trtx l e
Covu = Var (X11 (6 0 29)
The normalized correlation matrfz of a system of n random variables is com•
p1led from the pa1rw1se correla.t10n coetfic1ent.s of all the var1ahles, i e
t rlt
t '•"
Ttl
• 7 1n

rln,

U'~I~~= 1 r1n
'
...
1
wh~rn rlJ =
Cov 1,f(tf,t c{} 1s the coefnelel\t of eorr~lat1on ()f twc "Var1.ables
The normal prahabll t11 d1strtbutton of two random variables X, Y (the normal
Xt, X,.
d1~tr1bution on a plane) has a probab1hty dens1ty of the lorm

1 { i [ {%-mx)1
I (:r, !/) = 21'U1:t01/ y t -r I e"'tp - 2 (t - ri) rr!
v _

_ Zr (.t- m:e) hi- my) \JI-m II y• J} (6 0 00}


a%a 11 + o~ ;
where m...x:, m.ll are the mea.n values of the. raudom vanables X, Y 1 G$t ate tbetr a,
m ea.n square d ev 1at 1on!!I, and r 1~ the1 r co rrel a t1 on coeffte 1en t
For random va r1a hies . wh 1ch have a normal du~ tr1 bu tf on, uncorre1 a tl on 19 equ1 V·
alent to Independence If the random vt~rtable~ x . . Y are uncorrelated (1ndepen
dentL then r = 0 and

I (z, ll) == 2no:to 11 exp


f {
-- 2
1 ( (z-m.x)'
o~ + (Y -m")
o&
1
J} • (6 0 31)

In this ease the Oz and 0 II axes are ca lied the p rf ~ lp al axts oJ seattertng (dlJp
•ton) Jf, 1n add1hon m~ ~ mrt = 0, the normal d1str1hutlon assumes a canonktJ
'z
form
I (x* V) = 1 exp { - zl _ _II' } .. (6 0 32)
2na =-= 0' 11 2ai 2o-~

The pro lla b il1 t y that a rand 01n po tnt which bas a normal dtstrl hut 1on \VIll fall
Jn a rectangle R w1th s1des parallel to the pr1nctpal axes of se;~ttertng (see
F1g 6 0 ,), i5
P {(X" Y) ER}

=[ C%l ( ~-;;:x) -Cll( a.::cm:c)] [CI> { 6-::'J ) -cil ( y:;v )]• (6 033)

The elllplt of rqual probabiUtv d-tn1U1J (dlspers1on ell1_pse} is an ell1pse at -whose


all points the jowt probahtllty dens1 ty of the normal dlStrJhutlon is cons, tan\
Ch. 6. Systems of Random Variable s 149

f (x, y) =
const. The semi-ax es of the ellipse are proporti onal to Ux and cry: a =
kax and b = ku/1. • . . • • .
The probabi hty that a random pomt whtch h.as a ~ormal dist~Ibutton Will fa~l
in a domain Ek bounded by an ellipse of scatterm g With the semt-ax es a and b IS
2
P {(X, Y) E Eh} = 1 - e-h f2, (6.0.34)
where k is the size of the semi-ax es of the ellipse in the mean square deviatio ns,
i.e. k = alux = bluy.
If cr = cry = u, then the normall y distribu ted scatterin g is called circular.
For a ci~cu!ar normal scatteri ng with mx = my = 0 the distance R from the point
(X, Y) to the origin (the centre of scatterin g) has a Rayleig h distribu tion:
r2
-
e 2 cr
2
>
f (r) = o2
r for r 0. (6.0.35}

For the indepen dent random variable s (X, Y, Z) the normal probabi lity dis·
trihution in a three-di mension al space is
1
f (x, y, z) = (2n)3 / 2 cr xCJyOz
1 [ (x-mx) 2
xexp { - 2 oi, + (6.0.36)

The probabi lity that a random point (X, Y, Z) will fall in a domain Eh bounded
by an el~ipsoid of equal probabi lity density with semi-ax es a = kax, b = kay,
=
c ka 2 IS
(6.0.37)

Problenzs and Exercises


6.1. Two messages are being transm itted, each of which indepe ndent-
ly of the other, may be distort ed or not. The probab ility of an event
1 = {a message is distorted} for the first message is p 1 , for the second
IS P2· We conside r a system of two random variabl es (X, Y) defined
as follows:

X= { 6: if
if
the first message is distorte d;
the first message is not distorte d;

{ 6:
if the second message is distort ed;
y = if the second message is not distort ed
<X:and Y are the indicat ors of the event A in the first and the second
tnal).
Find the joint probab ility distrib ution of the pair of random vari-
abl~s (X, Y), i.e. the set of probab ilities PiJ for every combin ation of
their v~lues. Fin~ ~he joint p~o.babil.ity. dist.rib ution functio n F (x, y).
~~l?lion. The Jomt probab ility distnb utwn is defined by the prob-
abilities
Poo=P {X=O , Y=0} =q 1q2 , p 10 =P{X =1, Y=0} =p 1q2 ,
Pot=P {X=O , Y=1} =q 1p 2 , PH=P {X=1 , Y=1} =p 1p 2 ,
where qt = 1- p 1 , q2 = 1- p2 (see the table below).
t50 Appll~d Problemr In Probabllltpo 1heorg

~,

0 1
...
"' ......

0 q,ql PttJJ

1. qlpl P1Ps
~

The probability distribution on the x, y plane is concentrated at four points with coordinates (0, 0), (0, 1), (1, 0), (1, 1) (Fig. 6.1). Using the geometric interpretation of the probability distribution function as the probability that a point will fall in a quadrant whose vertex is at the point (x, y) (see Fig. 6.0.3), we get the following table of values for F(x, y):

              x ≤ 0     0 < x ≤ 1     x > 1
y ≤ 0           0           0           0
0 < y ≤ 1       0         q1q2          q2
y > 1           0          q1           1
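The joint distribution just obtained is easy to check numerically. The sketch below is an illustration only (the distortion probabilities p1 = 0.3 and p2 = 0.4 are arbitrary values, not from the problem); it simulates the two indicator variables and compares the empirical frequencies with the entries q1q2, p1q2, q1p2, p1p2 of the table above.

```python
import random

# Hypothetical illustration values for the distortion probabilities
p1, p2 = 0.3, 0.4
q1, q2 = 1 - p1, 1 - p2

N = 100_000
counts = {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 0}
for _ in range(N):
    x = 1 if random.random() < p1 else 0   # indicator "first message distorted"
    y = 1 if random.random() < p2 else 0   # indicator "second message distorted"
    counts[(x, y)] += 1

theory = {(0, 0): q1 * q2, (1, 0): p1 * q2, (0, 1): q1 * p2, (1, 1): p1 * p2}
for xy in sorted(counts):
    print(xy, counts[xy] / N, theory[xy])
```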

6.2. The distribution function of a system of two random variables (X, Y) is F(x, y). Find the probability that the random point (X, Y) will fall in the domain D (Fig. 6.2), which is bounded by the abscissa α on the right and by the ordinates β and γ from above and from below.
Answer. P{(X, Y) ∈ D} = F(α, β) − F(α, γ).

6.3. There are two independent random variables X and Y, each of


which has an exponential distribution, i.e.
f1(x) = λe^(−λx)  (x > 0),   f2(y) = μe^(−μy)  (y > 0).
Write expressions (1) for the joint probability density, and (2) for the distribution function of the system (X, Y).
Answer.
f(x, y) = 0 for x < 0 or y < 0,  f(x, y) = λμ e^(−(λx + μy)) for x > 0 and y > 0;
F(x, y) = 0 for x ≤ 0 or y ≤ 0,  F(x, y) = (1 − e^(−λx))(1 − e^(−μy)) for x > 0 and y > 0.
6.4. A system of random variables {X, Y) is distributed with a con-
stant density inside a square R with side a (Fig. 6.4a). Write an ex-
pression for the probability density function f (x, y). Construct the

Fig. 6.4

distribution function of the system. Write expressions for f 1 (x) and


/2 (y). Find out whether the random variables X andY are independent
or not.
Answer.
f(x, y) = 1/a² for (x, y) ∈ R,  f(x, y) = 0 for (x, y) ∉ R;
F(x, y) = 0 for x ≤ 0 or y ≤ 0,
F(x, y) = xy/a² for 0 < x ≤ a and 0 < y ≤ a,
F(x, y) = y/a for x > a and 0 < y ≤ a,
F(x, y) = x/a for 0 < x ≤ a and y > a,
F(x, y) = 1 for x > a and y > a.
The surface F(x, y) is shown in Fig. 6.4b.
f1(x) = 1/a for x ∈ (0, a), f1(x) = 0 for x ∉ (0, a);  f2(y) = 1/a for y ∈ (0, a), f2(y) = 0 for y ∉ (0, a).
The random variables X and Y are independent since
f(x, y) = f1(x) f2(y).
6.5. A distribution surface of a system of random variables (X, Y) is a right cone (Fig. 6.5a) whose base is a circle K with centre at the origin and radius r0. Outside the circle the joint probability density function f(x, y) is zero. (1) Write the expression for f(x, y); (2) find f1(x), f2(y), f2(y | x) and f1(x | y); (3) find out whether X and Y are dependent; and (4) find out whether X and Y are correlated.
Solution.
(1) f(x, y) = (3/(πr0³)) (r0 − √(x² + y²)) for x² + y² < r0²,  f(x, y) = 0 for x² + y² > r0².
(2) f1(x) = (3/(πr0³)) [r0 √(r0² − x²) − x² ln((r0 + √(r0² − x²))/|x|)] for |x| < r0,  f1(x) = 0 for |x| > r0;
f2(y) = (3/(πr0³)) [r0 √(r0² − y²) − y² ln((r0 + √(r0² − y²))/|y|)] for |y| < r0,  f2(y) = 0 for |y| > r0.
Furthermore, for |x| < r0,
f2(y | x) = (r0 − √(x² + y²)) / [r0 √(r0² − x²) − x² ln((r0 + √(r0² − x²))/|x|)] for |y| < √(r0² − x²),  f2(y | x) = 0 for |y| > √(r0² − x²),
and for |y| < r0,
f1(x | y) = (r0 − √(x² + y²)) / [r0 √(r0² − y²) − y² ln((r0 + √(r0² − y²))/|y|)] for |x| < √(r0² − y²),  f1(x | y) = 0 for |x| > √(r0² − y²).
(3) Since f1(x | y) is not identical to f1(x), the variables X and Y are dependent.
(4) We find the covariance Covxy. Since mx = my = 0, we have
Covxy = ∫∫_(K) xy f(x, y) dx dy = ∫∫_(K1) xy f(x, y) dx dy + ∫∫_(K2) xy f(x, y) dx dy,
where K1 is the right half of the circle and K2 is the left half (Fig. 6.5b); the function xy f(x, y) is odd with respect to the argument x and, therefore, the integrals with respect to K1 and K2 differ only in sign. Hence, when the integrals are summed, they cancel each other, and Covxy = 0. The random variables X and Y are therefore not correlated.
6.6. A pair of random variables (X, Y) has a joint probability density
f(x, y) = a/(1 + x² + x²y² + y²) = a/[(1 + x²)(1 + y²)].
(1) Find the coefficient a; (2) find out whether the random variables X and Y are dependent; find f1(x) and f2(y); (3) find the probability that
Fig. 6.5   Fig. 6.6

the random poi nt (X, Y) wil l fall in the squ are R whose cen tre coin -
cides with the orig in and whose sides are para llel to the coo rdin ate axe s
and are b = 2 in leng th (Fig. 6.6).
Solution. (1) Fro m the con diti on
00

~ JI (x, y) dx dy = 1
-=

we find tha t a = 1/n2 •


(2) The rand om vari able s X and Y are inde pen den t:
1 1
It (x) = n(1 +x2 ) , l2( y)= :~:(i+y2), f(x, Y)= ft (x)/2 (y)~
1 1

(3) P{X , Y) ER }= ) )
-1 -1
:~: 2 (i~:~y(i+Y 2) =!.
6.7. There are two independent random variables X and Y. The random variable X has a normal distribution with parameters mx = 0 and σx = 1/√2. The random variable Y has a uniform distribution on the interval (0, 1). Write the expressions for the joint probability density function f(x, y) and the distribution function F(x, y) of the system (X, Y).
Answer.
f(x, y) = (1/√π) e^(−x²) for y ∈ (0, 1),  f(x, y) = 0 for y ∉ (0, 1);
F(x, y) = 0 for y ≤ 0,
F(x, y) = y [Φ(x√2) + 0.5] for 0 < y ≤ 1,
F(x, y) = Φ(x√2) + 0.5 for y > 1.
6.8. The probability distribution surface f(x, y) of a system of two random variables (X, Y) is a right circular cylinder, the centre of whose base coincides with the origin (Fig. 6.8a) and whose altitude is h. Find the radius r of the cylinder, as well as f1(x), f2(y), f1(x | y), f2(y | x), mx, Varx and Covxy.
Solution. The radius r of the cylinder can be found from the condition that the volume of the cylinder is unity, whence r = 1/√(πh).
The joint probability density
f(x, y) = h if x² + y² < r²,  f(x, y) = 0 if x² + y² > r².
Consequently,
f1(x) = 2h √(r² − x²) for |x| < r,  f1(x) = 0 for |x| > r.
Similarly,
f2(y) = 2h √(r² − y²) for |y| < r,  f2(y) = 0 for |y| > r.
The graph of the function f1(x) is shown in Fig. 6.8b. For |y| < r we have
f1(x | y) = f(x, y)/f2(y) = 1/(2√(r² − y²)) for |x| < √(r² − y²),  f1(x | y) = 0 for |x| > √(r² − y²).
Similarly, for |x| < r, we have
f2(y | x) = 1/(2√(r² − x²)) for |y| < √(r² − x²),  f2(y | x) = 0 for |y| > √(r² − x²).
The mean values are zero, i.e. mx = my = 0, since the function f(x, y) is even with respect to both x and y. Furthermore,
Varx = ∫_{−r}^{r} x² f1(x) dx = r²/4,  Covxy = 0.
6.9. A random point (X, Y) has a constant probability distribution
density inside the square R hatched in Fig. 6.9a. Write an expression

Fig. 6.9

for the joint probability density function f(x, y). Find the expressions for the distribution densities f1(x) and f2(y) of the individual variables X and Y entering into the system. Write the expressions for the conditional probability densities f1(x | y) and f2(y | x). Are the random variables X and Y dependent? Are they correlated?
Solution. The area of the square is equal to two, and therefore
f(x, y) = 1/2 for (x, y) ∈ R,  f(x, y) = 0 for (x, y) ∉ R.
Furthermore,
f1(x) = (1/2) ∫_{−(1−x)}^{1−x} dy = 1 − x for 0 < x < 1,
f1(x) = (1/2) ∫_{−(1+x)}^{1+x} dy = 1 + x for −1 < x < 0,
f1(x) = 0 for x < −1 or x > 1,
or, briefly,
f1(x) = 1 − |x| for |x| < 1,  f1(x) = 0 for |x| > 1.
A graph of the probability density function f1(x) is shown in Fig. 6.9b (Simpson's distribution). Similarly,
f2(y) = 1 − |y| for |y| < 1,  f2(y) = 0 for |y| > 1.
Furthermore, for |y| < 1, we have
f1(x | y) = 1/(2(1 − |y|)) for |x| < 1 − |y|,  f1(x | y) = 0 for |x| > 1 − |y|.
A graph of the probability density function f1(x | y) is shown in Fig. 6.9c. Similarly, for |x| < 1, we have
f2(y | x) = 1/(2(1 − |x|)) for |y| < 1 − |x|,  f2(y | x) = 0 for |y| > 1 − |x|.
The random variables X and Y are dependent but not correlated.
6.10. The joint probability density of two random variables X and Y is given by the formula
f(x, y) = c exp{ −(1/k)[(x − 2)² − 1.2(x − 2)(y + 3) + (y + 3)²] }.
Find the correlation coefficient of the variables X and Y.
Answer. rxy = 0.6.
6.11. A system of random variables (X, Y) has a distribution with a probability density f(x, y). Express the probabilities of the following events in terms of the probability density f(x, y): (1) {X > Y}, (2) {X > |Y|}, (3) {|X| > Y} and (4) {Y − X > 1}.

Fig. 6.11

Solution. The domains D1, D2, D3, D4 corresponding to the occurrence of events 1-4 are hatched in Fig. 6.11a-d. The probabilities of points falling in these domains are
(1) P{X > Y} = ∫_{−∞}^{∞} dx ∫_{−∞}^{x} f(x, y) dy,
(2) P{X > |Y|} = ∫_{0}^{∞} dx ∫_{−x}^{x} f(x, y) dy,
(3) P{|X| > Y} = ∫_{−∞}^{∞} dx ∫_{−∞}^{|x|} f(x, y) dy,
(4) P{Y − X > 1} = ∫_{−∞}^{∞} dx ∫_{x+1}^{∞} f(x, y) dy.

6.12. A system of two random variables X and Y has a normal distribution with parameters mx = my = 0, σx = σy = σ and rxy = 0.

Fig. 6.12

Find the probabilities of the events
A = {|Y| < X},  B = {Y < X}  and  C = {Y < |X|}.
Solution. Fig. 6.12a-c shows the domains corresponding to the events A, B and C. For circular scattering the probabilities of the events are P(A) = 0.25, P(B) = 0.5, and P(C) = 0.75.
probability density function f (x), and
a random variable Y is in a functional
relation with it: Y = X 2 • Find the
distribution function F (x, y) of the ;c
system (X, Y).
Solution. Since the value of Y is com-
pletely defined by the value of X, the Fig. 6.13
random point (X, Y) can only lie on
the curve y = x 2 • The probability that it will fall in a quadrant
with vertex at the point (x, y) is equal to the probability that the
random point will fall on the projection onto the x-axis of a section
of the curve y = x 2 falling in the quadrant (Fig. 6.13). Using this
interpretation, we have
F(x, y) = 0 for y ≤ 0, or for y > 0 and x ≤ −√y;
F(x, y) = ∫_{−√y}^{√y} f(x) dx for y > 0 and x > √y;
F(x, y) = ∫_{−√y}^{x} f(x) dx for y > 0 and −√y < x ≤ √y.
6.14. A random point (X, Y) has a normal distribution on a plane with parameters mx = 1, my = −1, σx = 1, σy = 2 and rxy = 0. Find the probability that the random point will fall in the domain D bounded by the ellipse (x − 1)² + (y + 1)²/4 = 1.
Solution. The domain D is bounded by the ellipse of scattering E with the semi-axes a = σx = 1 and b = σy = 2; by (6.0.34) with k = 1 the probability of the point falling in this domain is p = 1 − e^(−1/2) ≈ 0.393.
6.15. Shells are fired at a point target and destroy everything within a circle of radius r. The scattering of the point where the shell lands is circular with parameters mx = my = 0 (the centre of scattering coincides with the target) and σx = σy = 2r. How many shells must be fired in order to destroy the target with probability P = 0.9?
Solution. The probability of destroying the target with one shell is
p = 1 − e^(−0.5²/2) ≈ 0.118.
The number of shots required is
n ≥ log(1 − P)/log(1 − p) = log 0.1/log 0.882 ≈ 18.4,  i.e.  n = 19.
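The two numbers above are quick to reproduce. The sketch below is only an illustration of formula (6.0.34) and of the shot-count estimate; it is not part of the original solution.

```python
import math

# Problem 6.14: the domain D coincides with the unit dispersion ellipse, so k = 1
p_614 = 1 - math.exp(-1 / 2)
print(round(p_614, 3))            # ~0.393

# Problem 6.15: circular scattering with sigma = 2r and destruction radius r, so k = 0.5
k = 0.5
p = 1 - math.exp(-k ** 2 / 2)     # probability of destroying the target with one shell
P = 0.9
n = math.log(1 - P) / math.log(1 - p)
print(round(p, 3), math.ceil(n))  # ~0.118, 19
```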
6.16. A system of three random variables (X, Y, Z) has a joint probability density f(x, y, z). Write expressions (1) for the probability density f1(x) of the random variable X; (2) for the joint probability density f2,3(y, z) of the random variables (Y, Z); (3) for the conditional probability density f2,3(y, z | x); (4) for the conditional probability density f2(y | x, z); (5) for the distribution function F(x, y, z); (6) for the distribution function F1(x) of the random variable X; (7) for the distribution function F1,2(x, y) of the subsystem (X, Y).
Answer.
(1) f1(x) = ∫∫_{−∞}^{∞} f(x, y, z) dy dz;
(2) f2,3(y, z) = ∫_{−∞}^{∞} f(x, y, z) dx;
(3) f2,3(y, z | x) = f(x, y, z) / ∫∫_{−∞}^{∞} f(x, y, z) dy dz;
(4) f2(y | x, z) = f(x, y, z) / ∫_{−∞}^{∞} f(x, y, z) dy;
(5) F(x, y, z) = ∫_{−∞}^{x} ∫_{−∞}^{y} ∫_{−∞}^{z} f(x, y, z) dx dy dz;
(6) F1(x) = F(x, ∞, ∞) = ∫_{−∞}^{x} ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y, z) dx dy dz;
(7) F1,2(x, y) = F(x, y, ∞) = ∫_{−∞}^{x} ∫_{−∞}^{y} ∫_{−∞}^{∞} f(x, y, z) dx dy dz.

6.17. A shell is fired at a point airborne target. The point where the shell explodes is normally distributed with the centre of scattering at the target. The mean square deviations are σx = σy = σz = σ. The target is destroyed if the distance from it to the point where the shell explodes does not exceed r0 = 2σ. Find the probability p that the target will be destroyed with one shell.
Solution. Using formula (6.0.37) for the probability that the shell falls in an ellipsoid of equal distribution density, with k = r0/σ = 2, we have
p = 2Φ(2) − 2√(2/π) e^(−2) ≈ 0.74.
6.18. A system of three random variables (X, Y, Z) is distributed with a constant density in the interior of a ball of radius r. Find the probability that the random point (X, Y, Z) will fall inside a concentric ball of radius r/2.
Answer. p = 1/8.
6.19. A ball is drawn from an urn which contains a white, b black and c red balls. The random variables X, Y, Z are defined by the following conditions:
X = 1 if a white ball is drawn, X = 0 if a black or a red ball is drawn;
Y = 1 if a black ball is drawn, Y = 0 if a white or a red ball is drawn;
Z = 1 if a red ball is drawn, Z = 0 if a white or a black ball is drawn
(X, Y, Z are the indicators of the events {a white ball}, {a black ball}, {a red ball}). Construct the correlation matrix and the normalized correlation matrix for the system of random variables X, Y, Z.
Solution. We find the covariances from the table of probabilities of the separate values of X, Y and Z. Using the notation
p_{xi yj zk} = P{X = xi, Y = yj, Z = zk},
we have
p000 = P{X = 0, Y = 0, Z = 0} = 0,
p100 = P{X = 1, Y = 0, Z = 0} = a/(a + b + c),
p010 = P{X = 0, Y = 1, Z = 0} = b/(a + b + c),
p001 = P{X = 0, Y = 0, Z = 1} = c/(a + b + c),
p110 = p101 = p011 = p111 = 0;
mx = a/(a + b + c),  my = b/(a + b + c),  mz = c/(a + b + c);
Covxy = Σ_{i,j,k} (xi − mx)(yj − my) p_{xi yj zk}
= (1 − a/(a+b+c))(0 − b/(a+b+c)) a/(a+b+c) + (0 − a/(a+b+c))(1 − b/(a+b+c)) b/(a+b+c)
+ (0 − a/(a+b+c))(0 − b/(a+b+c)) c/(a+b+c) = −ab/(a + b + c)².
Similarly,
Covxz = −ac/(a + b + c)²,  Covyz = −bc/(a + b + c)².
Next we find the variances:
Varx = α2[X] − mx² = a/(a+b+c) − a²/(a+b+c)² = a(b + c)/(a + b + c)²;
similarly Vary = b(a + c)/(a + b + c)² and Varz = c(a + b)/(a + b + c)².
The correlation matrix is
            Varx   Covxy   Covxz
‖Cov‖ =            Vary    Covyz
                           Varz
We find the correlation coefficients:
rxy = Covxy/√(Varx Vary) = −ab/√(ab(a + c)(b + c)) = −√(ab/((a + c)(b + c))).
Similarly,
rxz = −√(ac/((a + b)(c + b))),  ryz = −√(bc/((a + b)(a + c))).
The normalized correlation matrix is
          1   rxy   rxz
‖r‖ =          1    ryz
                     1
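As a small numerical illustration of the result (this sketch, with arbitrary urn contents a = 2, b = 3, c = 5, is not part of the original text), the covariances and correlation coefficients can be evaluated directly from the formulas above and compared with the closed-form expressions.

```python
import math

a, b, c = 2, 3, 5                      # hypothetical urn contents
s = a + b + c

var = {'x': a * (b + c) / s**2, 'y': b * (a + c) / s**2, 'z': c * (a + b) / s**2}
cov = {('x', 'y'): -a * b / s**2, ('x', 'z'): -a * c / s**2, ('y', 'z'): -b * c / s**2}

# closed-form correlation coefficients derived in the solution
closed = {('x', 'y'): -math.sqrt(a * b / ((a + c) * (b + c))),
          ('x', 'z'): -math.sqrt(a * c / ((a + b) * (c + b))),
          ('y', 'z'): -math.sqrt(b * c / ((a + b) * (a + c)))}

for pair, cuv in cov.items():
    r = cuv / math.sqrt(var[pair[0]] * var[pair[1]])
    print(pair, round(cuv, 4), round(r, 4), round(closed[pair], 4))
```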

6.20. We have a system of random variables X and Y. The random variable X has an exponential distribution with parameter λ, i.e. f(x) = λe^(−λx) for x > 0. For a given value X = x > 0 the random variable Y also has an exponential distribution, but with parameter x, i.e. f(y | x) = xe^(−xy) for y > 0. Write the joint probability density function f(x, y) for X and Y, find the probability density f2(y) of Y, and find the conditional probability density f1(x | y).
Solution.
f(x, y) = λx e^(−(λ + y)x) for x > 0 and y > 0,  f(x, y) = 0 for x < 0 or y < 0;
f2(y) = ∫_{−∞}^{∞} f(x, y) dx = λ/(λ + y)² for y > 0,  f2(y) = 0 for y < 0.
Furthermore, for y > 0, we have
f1(x | y) = f(x, y)/f2(y) = x(λ + y)² e^(−(λ + y)x) for x > 0,  f1(x | y) = 0 for x < 0.
6.21. Given two independent random variables: a continuous random
variable X with distribution density / 1 (x) and a discrete random var-
iable Y with the values y1 , y 2 , • • • , Yn which have the probabilities
p1 , p 2 , • • • , Pn. Find the probability distribution function of the system
of variables X, Y.
Answer. F(x, y) = F1(x) F2(y), where F1(x) = ∫_{−∞}^{x} f1(x) dx and
F2(y) = 0 for y ≤ y1,
F2(y) = p1 for y1 < y ≤ y2,
. . . . . . . . . . . . . . . . . . .
F2(y) = Σ_{i=1}^{k−1} pi for y_{k−1} < y ≤ y_k  (k = 2, 3, ..., n),
. . . . . . . . . . . . . . . . . . .
F2(y) = 1 for y > yn.
6.22. X is a discrete random variable with two values x1 and x2 (x2 > x1) which have probabilities p1 and p2. A random variable Y is continuous, its conditional distribution for X = xi being normal with mean value xi and mean square deviation σ (i = 1, 2).
Find the joint probability distribution function F(x, y) of the random variables X and Y. Find the probability density function f2(y) of the random variable Y.
Solution. F(x, y) = P{X < x} P{Y < y | X < x}. If x ≤ x1, then P{X < x} = 0 and F(x, y) = 0; assuming x1 < x ≤ x2, we get
P{X < x} = p1 and F(x, y) = p1 P{Y < y | X = x1} = p1 [Φ((y − x1)/σ)
+ 0.5]. Using the total probability formula, we have for x > x2
F(x, y) = p1 [Φ((y − x1)/σ) + 0.5] + p2 [Φ((y − x2)/σ) + 0.5].
Consequently,
F(x, y) = 0 for x ≤ x1,
F(x, y) = p1 [Φ((y − x1)/σ) + 0.5] for x1 < x ≤ x2,
F(x, y) = p1 [Φ((y − x1)/σ) + 0.5] + p2 [Φ((y − x2)/σ) + 0.5] for x > x2.
Furthermore, setting x = ∞ and differentiating with respect to y, we obtain
f2(y) = (d/dy) F(∞, y) = (1/(σ√(2π))) [p1 e^(−(y − x1)²/(2σ²)) + p2 e^(−(y − x2)²/(2σ²))].
6.23. The stars in the sky can be regarded as a Poisson field of points. The number of stars covered by a telescope objective lens is a random variable which has a Poisson distribution with parameter λs, where s is the area of the portion of the surface of a unit sphere that can be observed through the telescope (Fig. 6.23a). The view field of the telescope has a coordinate grid (Fig. 6.23b). Show that for an arbitrary alignment of the telescope the coordinates (X, Y) of the star nearest to the cross-hairs have a normal distribution with parameters mx = my = 0 and σx = σy = 1/√(2πλ).
Solution. We showed in Problem 5.19 that the distance R from the centre of the cross-hairs to the nearest point in the Poisson field has a Rayleigh distribution. But R = √(X² + Y²) and, consequently, the probability that the point (X, Y) will fall in a circle D of radius r, i.e. P{X² + Y² < r²}, can be written in two forms: either
P{R < r} = ∫_0^r 2πλr e^(−πλr²) dr   (6.23.1)
or
P{(X, Y) ∈ D} = ∫∫_(D) f(x, y) dx dy,

where f(x, y) is the joint probability density of the variables X, Y. By virtue of symmetry, we must assume that f(x, y) depends only on the distance, i.e. f(x, y) = g(r), where r = √(x² + y²). If we use polar coordinates (r, φ), we obtain
P{(X, Y) ∈ D} = ∫_0^{2π} dφ ∫_0^r r g(r) dr = 2π ∫_0^r r g(r) dr.   (6.23.2)
Comparing expressions (6.23.1) and (6.23.2), we find that g(r) = λe^(−πλr²) and, hence, f(x, y) = λe^(−πλ(x² + y²)), and that is what we wished to prove.
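The result is easy to observe in a simulation. The sketch below is an illustration only (the star density λ = 3 and the size of the simulated field are arbitrary assumptions): it scatters a planar Poisson field over a large square, takes the point nearest to the origin, and compares the standard deviation of its coordinates with 1/√(2πλ).

```python
import numpy as np

lam = 3.0                           # hypothetical density of the field (points per unit area)
half = 5.0                          # the field is the square [-half, half]^2
rng = np.random.default_rng(0)

coords = []
for _ in range(5_000):
    n = rng.poisson(lam * (2 * half) ** 2)         # number of points in the field
    if n == 0:
        continue
    pts = rng.uniform(-half, half, size=(n, 2))    # their uniform positions
    nearest = pts[np.argmin(np.hypot(pts[:, 0], pts[:, 1]))]
    coords.append(nearest)

coords = np.array(coords)
print(coords.std(axis=0))                   # empirical sigma_x, sigma_y
print(1 / np.sqrt(2 * np.pi * lam))         # theoretical value 1/sqrt(2*pi*lambda) ~ 0.23
```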
6.24. A source of α-particles is at the origin of a spherical system of coordinates (r, φ, θ) (Fig. 6.24), where 0 ≤ φ ≤ 2π, −π/2 ≤ θ ≤ π/2. The particles scatter uniformly in all directions. We consider a particle which moves in a random direction defined by the angles Φ and Θ. Write the joint probability density function f(φ, θ) of the random variables Φ and Θ.
Solution. If the α-particles are scattered uniformly in all directions, then a unit vector e from the origin defines the direction of the particle and all its possible endpoints on the sphere C of unit radius will have the same probability density. Consequently, the element of probability f(φ, θ) dφ dθ must be proportional to the surface element dS on the sphere C. The surface element is dS = dφ dθ cos θ, whence
f(φ, θ) dφ dθ = A cos θ dφ dθ;  f(φ, θ) = A cos θ,
where A is the proportionality factor, which can be determined from the condition
∫_0^{2π} dφ ∫_{−π/2}^{π/2} f(φ, θ) dθ = 1,
and hence A = 1/(4π). Thus
f(φ, θ) = (cos θ)/(4π) for 0 ≤ φ ≤ 2π, −π/2 ≤ θ ≤ π/2.
6.25*. On the hypothesis of the preceding problem we consider a plane P which is parallel to the equatorial plane (in which the angle φ is read off) and passes through a point O' which has spherical coordinates r = 1, φ = 0, θ = π/2 (Fig. 6.25). The plane has a system of Cartesian coordinates xO'y. All the α-particles which fly into the upper hemisphere of the scatter get onto the plane P. We consider one of these particles and the corresponding random variables, which are the coordinates X, Y of the point at which the α-particle falls on the plane P. Find the joint
probability density function f(x, y) of these random variables. Are the random variables X and Y mutually dependent?
Solution. We seek the probability element f(x, y) dx dy, which is approximately equal to the probability that the particle will fall in the surface element dx dy adjoining the point (x, y). We calculate this probability as we did in the preceding problem. We find the area dS of the portion of the unit sphere C such that the particles passing through it fall in the surface element dx dy. This portion is the central projection of the surface element dx dy onto the sphere C. The projection is found by multiplying the surface element by the cosine of the angle θ between the direction of projecting and the plane P. In addition, the surface decreases in inverse proportion to the square of the distance R from the centre of projection. The surface element dS on the sphere C is
dS = dx dy cos θ / R² = (dx dy / √(1 + x² + y²)) / (1 + x² + y²) = dx dy / (1 + x² + y²)^(3/2).
To obtain the probability element, we must divide dS by the area of the whole upper hemisphere, which is 2π. We obtain
f(x, y) dx dy = (1/(2π)) dx dy / (1 + x² + y²)^(3/2),  or  f(x, y) = (1/(2π)) (1 + x² + y²)^(−3/2).
The random variables X and Y are mutually dependent since their joint probability density function does not decompose into the product of two functions, one of which depends only on x and the other only on y.

Fig. 6.26
6.26. A random point A represents an object on a circular radar screen of unit radius (Fig. 6.26a) and has a uniform distribution within that circle. Find the joint probability density function f(r, φ) of the polar coordinates (R, Φ) of the point A. Are the random variables R and Φ dependent?
Solution. Let us consider, in polar coordinates, an elementary "rectangle" which corresponds to infinitesimal increments dr, dφ of the polar coordinates r, φ of the point inside the circle (Fig. 6.26b). Its area (to within higher-order infinitesimals) is r dr dφ. Dividing this by the area of the circle, which is equal to π, we get the probability element
f(r, φ) dr dφ = r dr dφ/π,
whence
f(r, φ) = r/π for 0 < r < 1, 0 < φ < 2π.   (6.26.1)
We find the probability density function f1(r) by integrating (6.26.1) over the whole range of variation of φ, i.e.
f1(r) = ∫_0^{2π} (r/π) dφ = 2r for 0 < r < 1.
Similarly,
f2(φ) = ∫_0^1 (r/π) dr = 1/(2π) for 0 < φ < 2π.
The graph of the distribution f1(r) is a right triangle (Fig. 6.26c); the distribution f2(φ) is uniform over (0, 2π).
Multiplying f1(r) by f2(φ), we get the joint probability density f(r, φ) of the system (R, Φ), and consequently, R and Φ are independent. Note that for the same uniform distribution the Cartesian coordinates X, Y of a point in the interior of a circle are mutually dependent. We proved this while solving Problem 6.8.
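A quick simulation makes both statements visible. The sketch below is an illustration only: it samples points uniformly in the unit circle by rejection, passes to polar coordinates and checks that M[R] ≈ 2/3 (as follows from f1(r) = 2r), that Φ is uniform, and that the empirical covariance of R and Φ is near zero.

```python
import math
import random

n = 200_000
rs, phis = [], []
while len(rs) < n:
    x, y = random.uniform(-1, 1), random.uniform(-1, 1)
    if x * x + y * y <= 1:                        # rejection sampling inside the unit circle
        rs.append(math.hypot(x, y))
        phis.append(math.atan2(y, x) % (2 * math.pi))

mean_r = sum(rs) / n                              # should be close to 2/3
mean_phi = sum(phis) / n                          # should be close to pi
cov = sum((r - mean_r) * (p - mean_phi) for r, p in zip(rs, phis)) / n
print(round(mean_r, 3), round(mean_phi, 3), round(cov, 4))
```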
6.27. On the hypothesis of the preceding problem, the radius of the screen is not unity but a. Write the expressions for the joint probability density of the polar coordinates of the point A and their separate densities.
Solution. Dividing the area of the elementary "rectangle" by the area of the screen πa² and cancelling by dr dφ, we obtain
f(r, φ) = r/(πa²) for 0 < r < a, 0 < φ < 2π;
f1(r) = 2r/a² for 0 < r < a;
f2(φ) = 1/(2π) for 0 < φ < 2π.
CHAPTER 7

Numerical Characteristics of Functions


of Random Variables

7.0. One of the most efficient means of solving probability problems is to use numerical characteristics. The characteristics of the random variables of interest can then be found without involving their distributions. In particular, it is not necessary to know the distributions of the functions of random variables in order to find their characteristics; it is sufficient to know the distributions of the arguments.
If X is a discrete random variable with an ordered series
x1, x2, ..., xn;  p1, p2, ..., pn  (Σ_{i=1}^{n} pi = 1),
and the variables Y and X are in a functional relationship Y = φ(X), then the mean value of the variable Y is
my = M[φ(X)] = Σ_{i=1}^{n} φ(xi) pi,   (7.0.1)
and the variance can be expressed by one of two formulas, either
Vary = Var[φ(X)] = Σ_{i=1}^{n} [φ(xi) − my]² pi   (7.0.2)
or
Vary = Σ_{i=1}^{n} [φ(xi)]² pi − my².   (7.0.3)
If (X, Y) is a system of discrete random variables whose distribution is characterized by the probabilities
pij = P{X = xi, Y = yj},
and Z = φ(X, Y), then the mean value of Z is
mz = M[φ(X, Y)] = Σ_i Σ_j φ(xi, yj) pij,   (7.0.4)
and the variance can be expressed by one of the two formulas, either
Varz = Var[φ(X, Y)] = Σ_i Σ_j [φ(xi, yj) − mz]² pij   (7.0.5)
or
Varz = Σ_i Σ_j [φ(xi, yj)]² pij − mz².   (7.0.6)
If X is a continuous random variable with probability density f(x), and Y = φ(X), then the mean value of Y is
my = M[φ(X)] = ∫_{−∞}^{∞} φ(x) f(x) dx,   (7.0.7)
and the variance can be expressed by one of the two formulas, either
Vary = Var[φ(X)] = ∫_{−∞}^{∞} [φ(x) − my]² f(x) dx   (7.0.8)
or
Vary = ∫_{−∞}^{∞} [φ(x)]² f(x) dx − my².   (7.0.9)
If (X, Y) is a system of continuous random variables with the joint probability density f(x, y), and Z = φ(X, Y), then the mean value of Z is
mz = M[φ(X, Y)] = ∫∫_{−∞}^{∞} φ(x, y) f(x, y) dx dy   (7.0.10)
and the variance can be expressed by one of the two formulas, either
Varz = Var[φ(X, Y)] = ∫∫_{−∞}^{∞} [φ(x, y) − mz]² f(x, y) dx dy   (7.0.11)
or
Varz = ∫∫_{−∞}^{∞} [φ(x, y)]² f(x, y) dx dy − mz².   (7.0.12)
If (X1, ..., Xn) is a system of n continuous random variables with density f(x1, ..., xn), and Y = φ(X1, ..., Xn), then the mean value of Y is
my = M[φ(X1, ..., Xn)] = ∫...∫_{−∞}^{∞} φ(x1, ..., xn) f(x1, ..., xn) dx1 ... dxn   (7.0.13)
and the variance can be expressed by one of the two formulas, either
Vary = Var[φ(X1, ..., Xn)] = ∫...∫_{−∞}^{∞} [φ(x1, ..., xn) − my]² f(x1, ..., xn) dx1 ... dxn   (7.0.14)
or
Vary = ∫...∫_{−∞}^{∞} [φ(x1, ..., xn)]² f(x1, ..., xn) dx1 ... dxn − my².   (7.0.15)
In some cases we do not even need to know the distributions of the arguments but only their characteristics in order to get the characteristics of functions. Here are the fundamental theorems on characteristics.
1. If c is a nonrandom variable, then
M[c] = c,  Var[c] = 0.   (7.0.16)
2. If c is a nonrandom variable and X is a random variable, then
M[cX] = cM[X],  Var[cX] = c² Var[X].   (7.0.17)
3. The addition theorem for mean values. The mean value of a sum of random variables is equal to the sum of their mean values:
M[X + Y] = M[X] + M[Y],   (7.0.18)
and, in general,
M[Σ_{i=1}^{n} Xi] = Σ_{i=1}^{n} M[Xi].   (7.0.19)
4. The mean value of a linear function of several random variables
Y = Σ_{i=1}^{n} ai Xi + b,
where the ai and b are nonrandom coefficients, is equal to the same linear function of their mean values:
my = M[Σ_{i=1}^{n} ai Xi + b] = Σ_{i=1}^{n} ai m_{xi} + b,   (7.0.20)
where m_{xi} = M[Xi]. This rule can be written in a concise form, viz.
M[L(X1, X2, ..., Xn)] = L(m_{x1}, m_{x2}, ..., m_{xn}),   (7.0.21)
where L is a linear function.
5. The mean value of the product of two random variables X and Y is expressed by the formula
M[XY] = M[X] M[Y] + Covxy,   (7.0.22)
where Covxy is the covariance of the variables X and Y. This formula can be rewritten as follows:
Covxy = M[XY] − mx my   (7.0.23)
or, bearing in mind that M[XY] = α1,1[X, Y],
Covxy = α1,1[X, Y] − mx my.   (7.0.24)
6. The multiplication theorem for mean values. The mean value of the product of two uncorrelated random variables X, Y is equal to the product of their mean values:
M[XY] = M[X] M[Y].   (7.0.25)
7. If X1, X2, ..., Xn are independent random variables, then the mean value of their product is equal to the product of their mean values:
M[Π_{i=1}^{n} Xi] = Π_{i=1}^{n} M[Xi].   (7.0.26)
8. The variance of the sum of two random variables is expressed by the formula
Var[X + Y] = Var[X] + Var[Y] + 2Covxy.   (7.0.27)
9. The variance of the sum of several random variables is expressed by the formula
Var[Σ_{i=1}^{n} Xi] = Σ_{i=1}^{n} Var[Xi] + 2Σ_{i<j} Cov_{xi xj},   (7.0.28)
where Cov_{xi xj} is the covariance of the random variables Xi and Xj.
10. The addition theorem for variances. The variance of the sum of two uncorrelated random variables X and Y is equal to the sum of their variances:
Var[X + Y] = Var[X] + Var[Y],   (7.0.29)
and, in general, for uncorrelated random variables
Var[Σ_{i=1}^{n} Xi] = Σ_{i=1}^{n} Var[Xi].   (7.0.30)
11. The variance of a linear function of several random variables, i.e.
Y = Σ_{i=1}^{n} ai Xi + b,
where the ai and b are nonrandom variables, is expressed by the formula
Vary = Var[Σ_{i=1}^{n} ai Xi + b] = Σ_{i=1}^{n} ai² Var[Xi] + 2Σ_{i<j} ai aj Cov_{xi xj}.   (7.0.31)
When the variables X1, X2, ..., Xn are uncorrelated, we have
Vary = Var[Σ_{i=1}^{n} ai Xi + b] = Σ_{i=1}^{n} ai² Var[Xi].   (7.0.32)
12. When several uncorrelated random vectors are added, their covariances are added, e.g. if
X = X1 + X2, Y = Y1 + Y2;  Cov_{x1 x2} = Cov_{x1 y2} = Cov_{y1 y2} = Cov_{y1 x2} = 0,
then
Covxy = Cov_{x1 y1} + Cov_{x2 y2}.   (7.0.33)
Linearization of functions. The function φ(X1, X2, ..., Xn) of several random arguments X1, X2, ..., Xn is said to be almost linear if it can be linearized (approximated by a linear function) with an accuracy sufficient for applications in the whole range of the practically possible values of the arguments. This means that
φ(X1, ..., Xn) ≈ φ(m_{x1}, m_{x2}, ..., m_{xn}) + Σ_{i=1}^{n} (∂φ/∂xi)_m (Xi − m_{xi}),   (7.0.34)
where
(∂φ/∂xi)_m = ∂φ(m_{x1}, m_{x2}, ..., m_{xn})/∂xi
is the partial derivative of the function φ(x1, x2, ..., xn) with respect to the argument xi in which each argument is replaced by its mean value.
The mean value of the almost linear function Y = φ(X1, X2, ..., Xn) can be approximated by
my ≈ φ(m_{x1}, m_{x2}, ..., m_{xn}).   (7.0.35)
The variance of an almost linear function can be approximated by
Vary ≈ Σ_{i=1}^{n} (∂φ/∂xi)²_m Var_{xi} + 2Σ_{i<j} (∂φ/∂xi)_m (∂φ/∂xj)_m Cov_{xi xj},   (7.0.36)
where Var_{xi} is the variance of the random variable Xi and Cov_{xi xj} is the covariance of the random variables Xi and Xj.
When the random arguments X1, X2, ..., Xn are uncorrelated, then
Vary ≈ Σ_{i=1}^{n} (∂φ/∂xi)²_m Var_{xi}.
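The linearization formulas are easy to try out numerically. The sketch below is only an illustration (the function φ(x1, x2) = x1/(1 + x2) and the moments of the arguments are arbitrary assumptions, not taken from the text): it compares the approximations (7.0.35) and the uncorrelated case of (7.0.36) with a Monte Carlo estimate for independent normal arguments with small scatter.

```python
import math
import random

# Illustrative "almost linear" function and moments of its arguments
phi = lambda x1, x2: x1 / (1 + x2)
m1, m2 = 2.0, 1.0
var1, var2 = 0.1 ** 2, 0.05 ** 2      # small scatter, so phi is almost linear

# Partial derivatives of phi evaluated at the mean point
d1 = 1 / (1 + m2)
d2 = -m1 / (1 + m2) ** 2

m_lin = phi(m1, m2)                              # approximation (7.0.35)
var_lin = d1 ** 2 * var1 + d2 ** 2 * var2        # uncorrelated case of (7.0.36)

# Monte Carlo check with independent normal arguments
ys = [phi(random.gauss(m1, math.sqrt(var1)), random.gauss(m2, math.sqrt(var2)))
      for _ in range(200_000)]
m_mc = sum(ys) / len(ys)
var_mc = sum((y - m_mc) ** 2 for y in ys) / len(ys)
print(m_lin, m_mc)
print(var_lin, var_mc)
```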

Problems and Exercises

7.1. The discrete random variable X has an ordered series
X:  −1, 0, 1, 2  with probabilities  0.2, 0.1, 0.3, 0.4.
Find the mean value and variance of the random variable Y = 2^X.
Solution. my = 2^(−1) × 0.2 + 2^0 × 0.1 + 2^1 × 0.3 + 2^2 × 0.4 = 2.4 and
Vary = α2[Y] − my² = (2^(−1))² × 0.2 + (2^0)² × 0.1 + (2^1)² × 0.3 + (2^2)² × 0.4 − 2.4² = 1.99.
7.2. A continuous random variable X is distributed in the interval (0, 1) with density f(x) = 2x for x ∈ (0, 1) (Fig. 7.2). Find the mean value and variance of the square of this random variable, Y = X².
Solution.
my = α2[X] = ∫_0^1 x² · 2x dx = 1/2,
Vary = α2[Y] − my² = ∫_0^1 (x²)² · 2x dx − (1/2)² = 1/3 − 1/4 = 1/12.
7.3. A random variable X has an exponential distribution with probability density f(x) = λe^(−λx) for x > 0 (λ > 0). Find the mean value and variance of the random variable Y = e^(−X).
Solution.
my = ∫_0^∞ e^(−x) λe^(−λx) dx = λ/(λ + 1),
Vary = α2[Y] − my² = ∫_0^∞ e^(−2x) λe^(−λx) dx − [λ/(λ + 1)]² = λ/(λ + 2) − λ²/(λ + 1)² = λ/[(λ + 2)(λ + 1)²].
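The answers of Problems 7.1 and 7.3 can be reproduced directly from formulas (7.0.1)-(7.0.3) and (7.0.7)-(7.0.9). The sketch below is an illustration only; the value λ = 2 used for Problem 7.3 is an arbitrary assumption.

```python
import math

# Problem 7.1: Y = 2**X for the discrete X given above
xs = [-1, 0, 1, 2]
ps = [0.2, 0.1, 0.3, 0.4]
m_y = sum(2 ** x * p for x, p in zip(xs, ps))                      # formula (7.0.1)
var_y = sum((2 ** x) ** 2 * p for x, p in zip(xs, ps)) - m_y ** 2  # formula (7.0.3)
print(m_y, var_y)                                                  # 2.4, 1.99

# Problem 7.3: Y = exp(-X), X exponential with parameter lam (illustration: lam = 2)
lam = 2.0
dx = 1e-4
grid = [i * dx for i in range(200_000)]                            # integrate up to x = 20
m = sum(math.exp(-x) * lam * math.exp(-lam * x) * dx for x in grid)
a2 = sum(math.exp(-2 * x) * lam * math.exp(-lam * x) * dx for x in grid)
print(m, lam / (lam + 1))                                          # both ~ 2/3
print(a2 - m ** 2, lam / ((lam + 2) * (lam + 1) ** 2))             # both ~ 1/18
```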
7.4. A random variable X has an exponential distribution with probability density f(x) = λe^(−λx) for x > 0. Determine the conditions under which the mean value and variance of the random variable Y = e^X exist and find what they are equal to.
Solution.
my = ∫_0^∞ e^x λe^(−λx) dx = λ ∫_0^∞ e^(−(λ−1)x) dx,
i.e. for λ > 1 this integral exists and my = λ/(λ − 1); for λ ≤ 1 it diverges. Furthermore,
α2[Y] = ∫_0^∞ e^(2x) λe^(−λx) dx = λ ∫_0^∞ e^(−(λ−2)x) dx.
For λ > 2 this integral exists and equals λ/(λ − 2), and the variance is Vary = λ/(λ − 2) − [λ/(λ − 1)]²; for λ ≤ 2 the integral diverges and the variance Vary does not exist.
Fig. 7.5   Fig. 7.7   Fig. 7.8

7.5. A random variable X is distributed with probability density f(x) = 0.5 cos x for x ∈ (−π/2, π/2) (Fig. 7.5). Find the mean value and variance of the random variable Y = sin X.
Solution.
my = (1/2) ∫_{−π/2}^{π/2} sin x cos x dx = 0,
Vary = (1/2) ∫_{−π/2}^{π/2} sin²x cos x dx = 1/3.
7.6. A random variable X has the same distribution as in the preceding problem. Find the mean value and variance of the random variable Y = |sin X|.
Solution.
my = (1/2) ∫_{−π/2}^{π/2} |sin x| cos x dx = ∫_0^{π/2} sin x cos x dx = 1/2,
α2[Y] = (1/2) ∫_{−π/2}^{π/2} sin²x cos x dx = ∫_0^{π/2} sin²x cos x dx = 1/3,  Vary = 1/3 − 1/4 = 1/12.
7.7. A random point (X, Y) is uniformly distributed in the interior of a circle K of radius r = 1 (Fig. 7.7). Find the mean value and variance of the random variable Z = XY.
Solution.
f(x, y) = 1/π for (x, y) ∈ K,
mz = (1/π) ∫∫_(K) xy dx dy = 0,
Varz = (1/π) ∫∫_(K) x²y² dx dy = (1/π) ∫_0^{2π} cos²φ sin²φ dφ ∫_0^1 r⁵ dr = 1/24.
7.8. A random point (X, Y) is uniformly distributed in the interior of a square R (Fig. 7.8). Find the mean value and variance of the random variable Z = XY.
Solution. Since the random variables X and Y are mutually independent, we have
mz = mx my = (1/2)(1/2) = 1/4,
Varz = α2[Z] − mz² = M[(XY)²] − mz² = M[X²] M[Y²] − mz²;
M[X²] = M[Y²] = α2[X] = 1/3,  Varz = 1/9 − 1/16 = 7/144.
7.9. Two random variables X and Y are related as Y = 2 − 3X. The numerical characteristics of the variable X are given, i.e. mx = −1 and Varx = 4. Find (1) the mean value and variance of the variable Y, and (2) the covariance and correlation coefficient of the variables X and Y.
Solution.
(1) my = 2 − 3mx = 5,  Vary = (−3)² × 4 = 36;
(2) Covxy = M[XY] − mx my = M[X(2 − 3X)] + 1 × 5 = 2M[X] − 3M[X²] + 5;
M[X²] = α2[X] = Varx + mx² = 4 + 1 = 5,
whence Covxy = −2 − 3 × 5 + 5 = −12,
rxy = Covxy/(σx σy) = −12/√(4 × 36) = −1,
which is quite natural since X and Y are in a linear functional relationship with a negative coefficient of X.
7.10. A system of random variables (X, Y, Z) has the mean values mx, my and mz and the correlation matrix
            Varx   Covxy   Covxz
‖Cov‖ =            Vary    Covyz
                           Varz
Find the mean value and variance of the random variable U = aX − bY + cZ − d.
Answer. mu = amx − bmy + cmz − d,  Varu = a² Varx + b² Vary + c² Varz − 2ab Covxy + 2ac Covxz − 2bc Covyz.
7.11. An n-dimensional random vector X = (X1, X2, ..., Xn) is made up of n random variables Xi with mean values m_{xi} (i = 1, ..., n) and variances Var_{xi} (i = 1, ..., n), and has a normalized correlation matrix ‖r_{xi xj}‖ (i = 1, 2, ..., n; j > i). The random vector X is transformed into an m-dimensional random vector Y = (Y1, Y2, ..., Ym), the components of the vector Y resulting from the components of the vector X upon linear transformations:
Yk = Σ_{i=1}^{n} a_{ik} Xi + b_k  (k = 1, 2, ..., m).
Find the characteristics of the random vector Y, i.e. the mean values m_{yk} (k = 1, ..., m), the variances Var_{yk} (k = 1, ..., m) and the elements of the normalized correlation matrix ‖r_{yk yl}‖ (l = 1, 2, ..., m; k < l).
Answer.
m_{yk} = Σ_{i=1}^{n} a_{ik} m_{xi} + b_k  (k = 1, ..., m),
Var_{yk} = Σ_{i=1}^{n} a_{ik}² Var_{xi} + 2Σ_{i<j} a_{ik} a_{jk} r_{xi xj} √(Var_{xi} Var_{xj}),
r_{yk yl} = Cov_{yk yl} / √(Var_{yk} Var_{yl}),
where
Cov_{yk yl} = Σ_{i=1}^{n} a_{ik} a_{il} Var_{xi} + Σ_{i<j} (a_{ik} a_{jl} + a_{jk} a_{il}) r_{xi xj} √(Var_{xi} Var_{xj}).

7.12. There are two mutually independent random variables X and Y. The variable X has a normal distribution
f1(x) = (1/(2√(2π))) e^(−(x − 1)²/8)
and the variable Y has a uniform distribution in the interval (0, 2). Find (1) M[X + Y], (2) M[XY], (3) M[X²], (4) M[X − Y²], (5) Var[X + Y] and (6) Var[X − Y].
Solution.
(1) M[X + Y] = M[X] + M[Y] = 1 + 1 = 2,
(2) M[XY] = M[X] M[Y] = 1 × 1 = 1,
(3) M[X²] = α2[X] = Var[X] + mx² = 4 + 1 = 5,
(4) M[X − Y²] = M[X] − M[Y²] = 1 − α2[Y] = 1 − 4/3 = −1/3,
(5) Var[X + Y] = Var[X] + Var[Y] = 4 + 1/3 = 13/3,
(6) Var[X − Y] = Var[X] + (−1)² Var[Y] = 13/3.
7.13. A random variable X has a normal distribution f(x) = (1/(σ√(2π))) e^(−x²/(2σ²)). Find the mean value of the random variable Y = 1 − 3X² + 4X³.
Solution. my = M[1 − 3X² + 4X³] = 1 − 3α2[X] + 4M[X³]. Since mx = 0, it follows that M[X²] = σ²; for our normal distribution M[X³] = 0, hence my = 1 − 3σ².
7.14. The independent random variables X and Y have distributions f1(x) and f2(y), the graphs of whose probability density functions are shown in Fig. 7.14a, b. Find (1) M[X + Y], (2) Var[3X − 6Y + 1], (3) M[XY], (4) M[2XY − 3X² + Y² − 1].
Solution. M[X] = 2a/3, M[Y] = b/3, Var[X] = a²/18, Var[Y] = b²/18.
(1) M[X + Y] = (2a + b)/3,
(2) Var[3X − 6Y + 1] = 9 Varx + 36 Vary = a²/2 + 2b²,
(3) M[XY] = 2ab/9,
(4) M[2XY − 3X² + Y² − 1] = 2M[XY] − 3α2[X] + α2[Y] − 1 = 4ab/9 − 3a²/2 + b²/6 − 1.
7.15. Answer questions 1-3 of the preceding problem given that the variables X and Y are mutually dependent and their correlation coefficient is rxy = −0.9.
Solution. (1) M[X + Y] = (2a + b)/3;
(2) Var[3X − 6Y + 1] = 9 Varx + 36 Vary − 36 Covxy = a²/2 + 2b² + 36 × 0.9ab/18 = a²/2 + 2b² + 1.8ab;
(3) M[XY] = mx my + Covxy = 2ab/9 − 0.9ab/18 = 31ab/180.
7.16. The ends of a ruler AB of length l slide along the sides of a right angle xOy. The ruler occupies a random position shown in Fig. 7.16, with all the values of the abscissa X of its end A on the x-axis in the interval from 0 to l being equally probable. Find the mean value of the distance R from the origin to the ruler.
Solution. The random variable X is uniformly distributed in the interval (0, l), f(x) = 1/l for x ∈ (0, l). The random variable R can be expressed in terms of X (see Fig. 7.16): R = X√(1 − (X/l)²). Its
mean value is
mr = (1/l) ∫_0^l x √(1 − (x/l)²) dx = l/3.

7.17. The random variables V and U are in a linear relationship with the random variables X and Y, i.e. V = aX + bY + c and U = dX + fY + g. The numerical characteristics mx, my, Varx, Vary, Covxy of the system of the random variables X and Y are known. Find the characteristics of the system of random variables V, U, i.e. mv, mu, Varv, Varu, Covvu, rvu.
Solution.
mv = amx + bmy + c,  Varv = a² Varx + b² Vary + 2ab Covxy,
mu = dmx + fmy + g,  Varu = d² Varx + f² Vary + 2df Covxy.
Furthermore, for the centred variables V° = aX° + bY°, U° = dX° + fY°,
Covvu = M[V°U°] = ad Varx + bf Vary + (af + bd) Covxy,
rvu = Covvu/√(Varv Varu).
7.18. An analytical balance is used to weigh an object. The actual (unknown) mass of the object is a. Because of errors, the result of each weighing is accidental and has a normal distribution with parameters a and σ. To reduce the errors, the object is weighed n times, and the arithmetic mean of the results of the n weighings is taken as an approximate value of the mass, i.e. Y(n) = (1/n) Σ_{i=1}^{n} Xi. (1) Find the mean value and the mean square deviation of Y(n). (2) How many times must the object be weighed to reduce the mean square error of the mass ten times?
Solution. (1) M[Y(n)] = (1/n) Σ_{i=1}^{n} M[Xi]. Since the object is weighed under the same conditions every time, it follows that M[Xi] = a for any i, and then
M[Y(n)] = (1/n) Σ_{i=1}^{n} a = na/n = a.
Assuming the errors in each weighing to be independent, we find the variance of Y(n):
Var[Y(n)] = (1/n²) Σ_{i=1}^{n} Var[Xi] = (1/n²) Σ_{i=1}^{n} σ² = nσ²/n² = σ²/n,  σ[Y(n)] = σ/√n.
(2) The number of weighings n can be found from the condition σ[Y(n)] = σ/√n = σ/10. Hence n = 100.
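A simulation makes the 1/√n law of Problem 7.18 visible. The sketch below is an illustration only; the true mass a = 10 and the weighing error σ = 0.2 are arbitrary assumptions.

```python
import math
import random

a, sigma = 10.0, 0.2          # hypothetical true mass and weighing error
n, trials = 100, 20_000

means = []
for _ in range(trials):
    y = sum(random.gauss(a, sigma) for _ in range(n)) / n   # average of n weighings
    means.append(y)

m = sum(means) / trials
s = math.sqrt(sum((v - m) ** 2 for v in means) / trials)
print(m, s, sigma / math.sqrt(n))   # empirical mean ~ a, empirical sigma ~ sigma/10
```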
7.19. A luminous spot representing a target being tracked on a circular radar screen can accidentally occupy any place on the screen (the probability density is constant). The diameter of the screen is D. Find the mean value of the distance R from the luminous spot to the centre of the screen.
Solution. R = √(X² + Y²), where (X, Y) is a system of random variables uniformly distributed in the circle K_D of diameter D:
f(x, y) = 4/(πD²) for (x, y) ∈ K_D,
mr = M[R] = ∫∫_(K_D) √(x² + y²) · 4/(πD²) dx dy.
On passing to polar coordinates (r, φ),
mr = (4/(πD²)) ∫_0^{2π} dφ ∫_0^{D/2} r² dr = D/3.
7.20. Each of two points X and Y may independently occupy an accidental position on the interval (0, 1) of the abscissa axis (Fig. 7.20a), the probability density on the interval being constant for both variables. Find the mean values of the distance R between the points and of the square of the distance between them.
Solution. We have R = |Y − X|, mr = M[|Y − X|]. We represent the system of random variables (X, Y) as a random point on the x, y-plane (Fig. 7.20b) distributed with constant density f(x, y) = 1 in a square with a side equal to unity. In the domain D1: X > Y, |Y − X| = X − Y. In the domain D2: Y > X, |Y − X| = Y − X. Then
mr = ∫∫_(D1) (x − y) dx dy + ∫∫_(D2) (y − x) dx dy = ∫_0^1 dx ∫_0^x (x − y) dy + ∫_0^1 dy ∫_0^y (y − x) dx = 1/3,
M[R²] = M[|Y − X|²] = α2[Y] + α2[X] − 2mx my = 2(Varx + mx²) − 2mx² = 1/6.
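Both answers of Problem 7.20 are easy to confirm by Monte Carlo. The sketch below is an illustration only.

```python
import random

n = 1_000_000
s1 = s2 = 0.0
for _ in range(n):
    x, y = random.random(), random.random()   # two independent points on (0, 1)
    d = abs(x - y)
    s1 += d
    s2 += d * d

print(s1 / n, 1 / 3)    # mean distance, should be close to 1/3
print(s2 / n, 1 / 6)    # mean squared distance, should be close to 1/6
```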
7.21. There is a unit square K (Fig. 7.21). Points X and Y fall at random and independently of each other on two adjacent sides of the square. Each point is uniformly distributed within the limits of the respective side. Find the mean value of the square of the distance between them.
Solution. R² = X² + Y²;  M[R²] = α2[X] + α2[Y] = 2/3.
7.22. The conditions of the preceding problem are changed so that the points X and Y fall not on the adjacent but on the opposite sides of the square (Fig. 7.22). Find the mean value of the square of the distance between X and Y.
Solution. R² = 1 + (Y − X)²;  M[R²] = 1 + α2[Y] + α2[X] − 2M[X] M[Y] = 1 + 2/3 − 1/2 = 7/6.
7.23. The conditions of the preceding problems (7.21. and 7.22) are
changed so that the points X and Y may accidentally and independently
of each other occupy, with constant probability density, any positions
on the perimeter of the square K. Find the mean value of the square of
the distance between them.
Solution. Let us choose three hypotheses:
H1 = {the points X and Y fall on the same side of the square};
H2 = {the points X and Y fall on adjacent sides of the square};
H3 = {the points X and Y fall on opposite sides of the square}.
The mean value of the variable R 2 can be found by the complete
mathematical expectation formula
M[R²] = P(H1) M[R² | H1] + P(H2) M[R² | H2] + P(H3) M[R² | H3],
where M[R² | H1], M[R² | H2], M[R² | H3] are the conditional expectations of the variable R² on the corresponding hypotheses.
From the solutions of Problems 7.20, 7.21 and 7.22 we have
M[R² | H1] = 1/6,  M[R² | H2] = 2/3,  M[R² | H3] = 7/6.
value equal to the probability of a hit:
M[Xi] = pi = 2Δl/(Lπ),
whence
M[X] = Σ_{i=1}^{n} pi = 2nΔl/(Lπ) = 2l/(Lπ).
7.28. Any contour (convex or nonconvex, open or closed) of length l is thrown at random on a plane ruled with the parallel straight lines appearing in the preceding problem. Find the mean value of the number of times the contour and the lines intersect.
Solution. As in the preceding problem, M[Y] = 2l/(Lπ). To prove this, we must divide the contour into n elementary, practically straight sections of length Δl. For each of them the expectation of the number of hits is 2Δl/(Lπ), and for the whole contour it is 2l/(Lπ).

Fig. 7.29   Fig. 7.30

7.29. A convex closed contour of length l, whose largest dimension does not exceed L, is thrown at random on a plane ruled with parallel straight lines L apart (Fig. 7.29). Find the probability that it will intersect a line.
Solution. We designate the required probability as p, Y being the number of points of intersection of the contour and a line. Since the contour is convex and closed and its largest dimension is smaller than L, it may either intersect the lines twice or not at all. The ordered series of the random variable Y is
Y:  0 with probability 1 − p,  2 with probability p.
Hence M[Y] = 2p; on the other hand, by the result of Problem 7.28, M[Y] = 2l/(Lπ), so that p = l/(Lπ).
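The value 2l/(Lπ) for the mean number of crossings is easy to check for a straight needle of arbitrary length, including l > L. The sketch below is an illustration only; L = 1 and l = 2.5 are arbitrary assumptions.

```python
import math
import random

L, l = 1.0, 2.5            # line spacing and needle length (illustrative; l > L is allowed)
n = 500_000
total = 0
for _ in range(n):
    y = random.uniform(0, L)              # position of the needle centre between two lines
    theta = random.uniform(0, math.pi)    # orientation relative to the lines
    h = 0.5 * l * math.sin(theta)         # half-projection perpendicular to the lines
    # number of lines (multiples of L) lying between the two endpoints
    total += math.floor((y + h) / L) - math.floor((y - h) / L)

print(total / n, 2 * l / (L * math.pi))   # both ~ 1.59 for these values
```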
Solution. (a) Let us consider the straight lines which bound the rectangles to be two systems of lines, horizontal and vertical. We consider the following events:
A = {the needle cuts a vertical line},
B = {the needle cuts a horizontal line}.
Since the position of the needle with respect to the vertical lines does not affect its position with respect to the horizontal lines, the events A and B are independent; therefore, the required probability is
P(A + B) = P(A) + P(B) − P(A) P(B).
On the basis of Buffon's problem, P(A) = 2l/(πL), P(B) = 2l/(πM), whence
P(A + B) = 2l/(πL) + 2l/(πM) − 4l²/(π²LM).
(b) The random variable X = X1 + X2, where X1 and X2 are the numbers of intersections between the needle and a vertical and a horizontal line respectively. Dividing the needle of an arbitrary length l into elementary sections, as before, we have
M[X] = M[X1] + M[X2] = 2l/(πL) + 2l/(πM).
7.31. A rectangle of size l1 × l2 is thrown on a plane at random (Fig. 7.31); all the values of the angle θ are equiprobable. Find the mean value of the length X of its projection onto the x-axis.
Solution. We represent X as a sum X = X1 + X2, where X1 is the projection of the side l1 and X2 is the projection of the side l2. The required mean value is
M[X] = M[X1] + M[X2] = 2l1/π + 2l2/π = 2(l1 + l2)/π,
i.e. it is equal to the perimeter of the rectangle divided by π.
7.32. A convex closed contour of length l is thrown at random on a plane, all the orientations being equiprobable (Fig. 7.32). Find the mean value of the length X of its projection onto the x-axis.
We can find the probabilities of the hypotheses, i.e. P(H1) = 1/4, P(H2) = 1/2 and P(H3) = 1/4. Hence
M[R²] = (1/4) × 1/6 + (1/2) × 2/3 + (1/4) × 7/6 = 2/3.
7.24. A random variable X has a probability density f(x). We consider the function Y = min{X, a}, where a is a nonrandom variable. Find the mean value and variance of the random variable Y without resorting to its distribution.
Solution. By the general formula (7.0.7)
M[Y] = ∫_{−∞}^{∞} min{x, a} f(x) dx.
For x < a we get min{x, a} = x; for x ≥ a we get min{x, a} = a. Hence
M[Y] = ∫_{−∞}^{a} x f(x) dx + a ∫_{a}^{∞} f(x) dx = ∫_{−∞}^{a} x f(x) dx + a P{X > a}.   (7.24.1)
By analogy we find the second moment about the origin
α2[Y] = ∫_{−∞}^{a} x² f(x) dx + a² ∫_{a}^{∞} f(x) dx = ∫_{−∞}^{a} x² f(x) dx + a² P{X > a}   (7.24.2)
and the variance
Var[Y] = α2[Y] − (M[Y])².
7.25. The same question as in the preceding problem, but X is a discrete random variable assuming positive integral values with the probabilities defined by the ordered series
X:  1, 2, ..., n  with probabilities  p1, p2, ..., pn;   Y = min{X, a},
where a is a nonrandom positive integer included between 1 and n (1 < a < n).
Solution. M[Y] = Σ_{k=1}^{n} min{k, a} pk; for k < a we get min{k, a} = k, for k ≥ a we get min{k, a} = a. Then
M[Y] = Σ_{k=1}^{a−1} k pk + a Σ_{k=a}^{n} pk.
By analogy, we find the second moment about the origin
α2[Y] = Σ_{k=1}^{a−1} k² pk + a² Σ_{k=a}^{n} pk
and the variance
Var[Y] = α2[Y] − (M[Y])².
7.26. A number of independent trials are carried out, in each of which an event A may occur with probability p. The trials are terminated as soon as the event A occurs, and the total number of trials must not exceed N. Find the mean value and variance of the number Y of trials that will be made.
Solution. We first consider a random variable X which is the number of trials if the restriction N concerning the total number of trials is removed. The random variable X has a geometric distribution beginning with unity and its ordered series is
X:  1, 2, ..., k, ...  with probabilities  p, pq, ..., pq^(k−1), ...  (q = 1 − p).
The random variable Y is the minimum of X and N: Y = min{X, N}. On the basis of the results of Problem 7.25 we have (setting n = ∞)
M[Y] = Σ_{k=1}^{N−1} k pk + N Σ_{k=N}^{∞} pk = Σ_{k=1}^{N−1} k pq^(k−1) + N Σ_{k=N}^{∞} pq^(k−1)
= p { Σ_{k=1}^{N−1} k q^(k−1) + N Σ_{k=N}^{∞} q^(k−1) }.
We calculate the first sum:
Σ_{k=1}^{N−1} k q^(k−1) = Σ_{k=1}^{N−1} d(q^k)/dq = (d/dq)[(q − q^N)/(1 − q)] = [1 − Nq^(N−1) + (N − 1)q^N]/(1 − q)².
The second sum is equal to q^(N−1)/(1 − q), whence
M[Y] = [1 − Nq^(N−1) + (N − 1)q^N]/p + Nq^(N−1) = (1 − q^N)/p.
7.27. In Chapter 1 we considered Buffon's needle problem: a needle of length l is thrown at random on a plane which is ruled with parallel straight lines L > l apart. The probability of the needle hitting one of the lines is p = 2l/(Lπ). Since for l < L the number of hits can only be 0 or 1, the mean value of the number of hits is p. What will the mean value of the number of hits be if the restriction l < L is removed?
Solution. We divide the length l of the needle into n elementary sections Δl = l/n < L. The random variable X is the number of times the needle hits the lines. It can be represented as a sum X = Σ_{i=1}^{n} Xi, where Xi is the number of intersections of the lines and the ith elementary section. The random variable Xi is the indicator of the intersection of the ith section and one of the lines and has a mean
Solution. Since the contour is convex, each element Δx of the projection results from the projection of two and only two opposite elements of the contour, Δl1 and Δl2 (see Fig. 7.32); hence the average length of the contour projection is half the sum of the average lengths of the projections of the elementary sections Δl into which the contour can be divided:
M[X] = (1/2) Σ (2Δl/π) = l/π.
7.33. There is a random variable X with probability density f(x). Find the mean value and variance of the random variable Y = |X|.
Solution. The notation Y = |X| means that
Y = −X for X < 0,  Y = X for X ≥ 0;
my = M[Y] = ∫_{−∞}^{∞} |x| f(x) dx = −∫_{−∞}^{0} x f(x) dx + ∫_{0}^{∞} x f(x) dx = ∫_{0}^{∞} x [f(x) + f(−x)] dx,
Vary = α2[Y] − my² = ∫_{−∞}^{∞} |x|² f(x) dx − my² = α2[X] − my² = Varx + mx² − my².
7.34. Find the mean value and variance of the modulus of the random variable X, which is normally distributed with parameters m, σ.
Solution. It follows from the preceding problem that
my = −(1/(σ√(2π))) ∫_{−∞}^{0} x e^(−(x − m)²/(2σ²)) dx + (1/(σ√(2π))) ∫_{0}^{∞} x e^(−(x − m)²/(2σ²)) dx.
Changing the variables, (x − m)/σ = t, we get
my = −(1/√(2π)) ∫_{−∞}^{−m/σ} (tσ + m) e^(−t²/2) dt + (1/√(2π)) ∫_{−m/σ}^{∞} (tσ + m) e^(−t²/2) dt
= (2σ/√(2π)) e^(−(1/2)(m/σ)²) + 2mΦ(m/σ),
where Φ is the error function;
Vary = σ² + m² − my².
In particular, for m = 0,
my = √(2/π) σ ≈ 0.80σ,  Vary = σ² − (2/π)σ² = (1 − 2/π)σ² ≈ 0.36σ².
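The closed-form answer of Problem 7.34 is easy to confront with a simulation. The sketch below is an illustration only (m = 1 and σ = 2 are arbitrary values); the function Φ of the solution is expressed here through math.erf.

```python
import math
import random

m, sigma = 1.0, 2.0                                   # illustrative parameters
Phi = lambda x: 0.5 * math.erf(x / math.sqrt(2))      # normalized Laplace (error) function

m_y = (2 * sigma / math.sqrt(2 * math.pi)) * math.exp(-m ** 2 / (2 * sigma ** 2)) \
      + 2 * m * Phi(m / sigma)
var_y = sigma ** 2 + m ** 2 - m_y ** 2

samples = [abs(random.gauss(m, sigma)) for _ in range(500_000)]
mc_mean = sum(samples) / len(samples)
mc_var = sum((s - mc_mean) ** 2 for s in samples) / len(samples)
print(m_y, mc_mean)
print(var_y, mc_var)
```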
7.35*. Independent random variables X and Y have probability densities f1(x) and f2(y). Find the mean value and variance of the modulus of their difference Z = |X − Y|.
Solution.
mz = ∫∫_{−∞}^{∞} |x − y| f1(x) f2(y) dx dy.
A straight line y = x divides the x, y-plane into two domains, I and II (Fig. 7.35). In domain I, x > y and |x − y| = x − y. In domain II, y > x and |x − y| = y − x. Hence
mz = ∫∫_(I) (x − y) f1(x) f2(y) dx dy + ∫∫_(II) (y − x) f1(x) f2(y) dx dy
= ∫_{−∞}^{∞} x f1(x) { ∫_{−∞}^{x} f2(y) dy } dx − ∫_{−∞}^{∞} y f2(y) { ∫_{y}^{∞} f1(x) dx } dy
+ ∫_{−∞}^{∞} y f2(y) { ∫_{−∞}^{y} f1(x) dx } dy − ∫_{−∞}^{∞} x f1(x) { ∫_{x}^{∞} f2(y) dy } dx.
We introduce the distribution functions
F1(x) = ∫_{−∞}^{x} f1(x) dx,  F2(y) = ∫_{−∞}^{y} f2(y) dy.
Then we have
mz = ∫_{−∞}^{∞} x f1(x) F2(x) dx − ∫_{−∞}^{∞} y f2(y) [1 − F1(y)] dy
+ ∫_{−∞}^{∞} y f2(y) F1(y) dy − ∫_{−∞}^{∞} x f1(x) [1 − F2(x)] dx.
Combining the first integral with the fourth and the second with the third, we obtain
mz = ∫_{−∞}^{∞} [2x f1(x) F2(x) − x f1(x)] dx + ∫_{−∞}^{∞} [2y f2(y) F1(y) − y f2(y)] dy
= 2 ∫_{−∞}^{∞} x f1(x) F2(x) dx − mx + 2 ∫_{−∞}^{∞} y f2(y) F1(y) dy − my.
Since X and Y are independent, it follows that
α2[Z] = M[|X − Y|²] = M[(X − Y)²] = M[X²] + M[Y²] − 2M[X] M[Y]
= α2[X] + α2[Y] − 2mx my = Varx + Vary + (mx − my)².
From this we find that Varz = α2[Z] − mz².
7.36. Two independent random variables X and Y have probability densities f1(x) and f2(y). Find the mean value and variance of the minimum value of the two variables, i.e. Z = min(X, Y).
Solution.
Z = X if X ≤ Y,  Z = Y if X > Y.
A straight line y = x divides the x, y-plane into two domains (see Fig. 7.35): I, where Z = Y, and II, where Z = X (the case X = Y is not considered since it has a zero probability).
mz = M[Z] = ∫∫_(II) x f1(x) f2(y) dx dy + ∫∫_(I) y f1(x) f2(y) dx dy
= ∫_{−∞}^{∞} x f1(x) [1 − F2(x)] dx + ∫_{−∞}^{∞} y f2(y) [1 − F1(y)] dy,
where F1 and F2 are the distribution functions of the random variables X and Y;
α2[Z] = ∫_{−∞}^{∞} x² f1(x) [1 − F2(x)] dx + ∫_{−∞}^{∞} y² f2(y) [1 − F1(y)] dy,
Varz = α2[Z] − mz².
7.37. A random voltage V has a normal distribution f(v) with parameters mv and σv. The voltage V arrives at a limiter which leaves it equal to V if V ≤ v0 and makes it equal to v0 if V > v0:
Z = min{V, v0} = V for V ≤ v0,  Z = v0 for V > v0.
Find the mean value and variance of the random variable Z, which is the voltage taken off by the limiter.
Solution. Proceeding from Problem 7.24, we have
mz = M[Z] = ∫_{−∞}^{∞} min{v, v0} f(v) dv = ∫_{−∞}^{v0} v f(v) dv + ∫_{v0}^{∞} v0 f(v) dv
= mv [Φ(t0) + 0.5] + v0 [0.5 − Φ(t0)] − (σv/√(2π)) exp[−t0²/2],  where t0 = (v0 − mv)/σv;
α2[Z] = ∫_{−∞}^{v0} v² f(v) dv + ∫_{v0}^{∞} v0² f(v) dv
= (mv² + σv²)(Φ(t0) + 0.5) + v0² [0.5 − Φ(t0)] − ((2σv mv + σv² t0)/√(2π)) exp[−t0²/2];
Varz = α2[Z] − mz² = σv² { (1 + t0²)(Φ(t0) + 0.5) + (t0/√(2π)) exp[−t0²/2]
− [t0 (Φ(t0) + 0.5) + (1/√(2π)) exp[−t0²/2]]² }.
Note that if v0 = mv and t0 = 0, then mz = mv − σv(√(2π))^(−1), Varz = σv²(π − 1)(2π)^(−1).
7.38. N messages are being sent over a communication channel.
The lengths of the messages are accidental, have the same mean value m
and variance Var, and are independent. Find the mean value and vari-
ance of the total time T during which all N messages will be trans-
mitted. Find T nm.. , which is the maximum practically possible time
during which the messages can be transmitted.
n
Solution. T = ~ Th where Ti is the duration of the ith message
i=1
(i = 1, 2, ... , N). By the addition theorem for expectations we have

i\1 [T] = M [ Lj
N
T1
J= N
Lj M (Ti) =Nm.
i=1 i=1

By the addition theorem for variances, we have


N N
Var [T] = Var [
i=1
Lj Ti J= Lj Var [Td = N Var,
i=1

at= 1/Var [T] = V N Var.


B): the three-sigma rule T max = Nm + 3 VNVar.
'.39. Solve the preceding problem for the case when the lengths T 1
of the messages are dependent and the correlation coefficient of the
random variables T i and T 1 is rii.
Solution. The mean value is l\1 [T] = Nm. as before. To calculate
the variance, \Ve find the covariance of the variables T i and T 1 : Cov iJ =
186 Applied Probltmr fn Probability Theory

r~, 0 2
1 = r 11 Var B) formula (7 0 28) for the var1ance of the sum
Var IT]= A \tar+ 2 ~ r11 Var= Var (N
i<J
+ 2 1<i
2J r11)
T rna:~= 1\ m + 3 lf\'ar}T]
7 40 A s:tstem of r~ndoo1 'artables (X, ll tS uniformly dtstrihuted
\n a te~t~ng1~ l{ {F~g 7 4.0' Ftnd (t) '!llX Yl (2) ~llX - 1l +
!J

0 2 :r

(3) 1\ I l X l J (4) V a r [.\ -r- 1 1 (5) \ra r [X - YJ (6) l\ l r(X - 1')11


.and {7} 1\f [26¥ 3 + 31~ + J 1
Ans" er (1) J/2 (2) 1/2, (3) 1/2 (4) 5/12, (5) S/12, (6) 2/3 and (7) 6
7 41 Random faults occur ''hen an electronic. dev1ce IS operattng
The a' erage number of faults occurring per unit operatton time 1S A
and the number of fattlts occurrtng durtng the operation t1me "1' of the
de'\ 1ce 1s a random "\ c.r1able hav1ng a Po1sson d1str1hut1on Wlth para
meter a = At To cle1f a fanlt (to repatt the devH~e) a random ttme TreP
1s required '' h1ch has tln e'tponenttal dtstribution f (t) = J.lP ~ou fort> 0
The t1mes requ1red to clear separate faults are tndependent F1nd (1)
the a' erage fract lon oi t1me d ur1ng \\ h1ch the de'ltce '' 1ll operate fault
fre-e and the a\ erage fractuJn of ltme the repalr~ ,, tll last and (2) the
a' er3ge ttme Inter' al bet\\ een t'' o faults
S\)\U\\\)n \1) '!h~ a\ ~'ia~~ \.~mY: CJt ian\\ ir-ee ~ nTm"&.nC~ ui \h~ ~~.... \.~
(the mean \alue of the ttme from the moment the devtce 1s put tnto
operation until the moment tt IS S_.!opped for repatrs) 7sound = t/~ The
.a"\ erage ttme needed for repatrs trep = 1/~t The average fraction of
t1me during ''h1ch the de, 1ce \\ 1ll operate fault free
a~ t so rut == i /l - -::----:J-l-
t$ound +treP 1/A.+ 1/fl "-+ f1
Stmtlarl} the average fraction of ttme during "h1ch the de\ 1ce "tll
be repaired
~ = 1 - a. = ~/{k + p,)
(2) The a"erage 1nter' al of t1me /~ bet'' een t\\ o successt' e faults
-f = -
t fsound
-
+ tr~P =
1/A, 1/~L = (/w +
p.)/O~~t) +
7 42 A random po1nt {X 1''") has a normal dls\tthut1on on a plane
'' tth Circular ~cattert:og 1 e m:c = m 11 ~ 0 and ax = a., = a A ran
Ch. 7. Chara cterist ics of Funct ions of Rando m Variab les 187

dom varia ble R is the distan ce from the point (X, Y) to the centr e of
scattering. Find the mean value and varia nce of the varia ble R.
Solution.
R=V X2+ Y2.
oc

mr= M [R] =) JV x2+y2 2a\n exp ( --


x2+y2 )
Zcr 2 dx dy.
-oc

Using polar coord inates r and cp, we get


oc 2rr

mr = Jr 2
2: 2 rc exp ( - ;;2 ) dr Jdcp = cr 1/ ~ ~ 1.25cr.
0 0
oc

Varr=Var[R]=a 2 [R]-m~ = JJ 1
(x2 +y2)-:2,-cr·::-
2 2
!
2rc- exp ( _x 2 2Y ) dx
-oo

00

dy- m~ =
rl rs 1 (
2a2n exp -
r2
zcrz

0
00

= 2cr 2 Jte- dt -
1
cr 2 n =cr2 4-n •
2 2
0

7.43. Prove that if a rando m varia ble X has a binom ial distri butio n
with param eters n and p, then its mean value M [X] = np, and its
variance Var [X] = npq (q = 1 - p).
Solution, X is the numb er of occurrences of an event A inn inde-
n
pendent trials . It can be repre sente d in the form X= 2J X1, where
i=l
Xt is the indic ator of the event A in the ith trial, i.e.
if the event occurs on the ith trial;
if the event does not occur on the ith trial,
M [Xd = p, Var [Xd = pq,
n n
l\1 [X]= 2j M [Xtl = np,
i=i
Var[ X]= 2J
i=i
pq=n pq.

_7.44. Prove that for n indep enden t trials , in which an event A occurs
Wlth proba bilitie s p 1 , p 2 , • • • , Pn• the mean value and varia nce of
the number X of occurrences of the event are
n n
M[X] = Lj Pt; Var [X]= 2j Piqh where qi = 1- Pi·
i=l i=l
188 Applfed Probl~m• In Probabllltlf Theory

'fl

Sotut10ll. By analogy W\tl\ tl1e pr~edlng }lroblem X= ~ xh


i~t

whe-re X 1 1s the xndica tor of tl1c event A 1n the 'th tr1al


n n ..,.. n

~lfXJ = 2j ?\I(X,] =
t-1
2J
•-1
Pt, Var [XJ = ~ Var 1Xt1 ~ p,ql
1-=-1 t--1

7 45 .. ,,,.e co nstder n tndependent tru.lls 1n each of 'vhtch an e\ ent A


may occur \VIth probabdtty p F1nd the mean value and variance of the
frequency 1,. of the event A Ftn d the range of prac:ttcally possible values
of the frequency
Solut1on. }'" = Xln where X 1s the number of occurrences of the
e\ent A
1
l\lfYJ=='l(X]/n= n np=p, Var[YJ=npq/n 2 =pq/n,
au= VVar l Y ( = V pqln~
where q = 1 - p The range of ptactlcally poss1ble values oi Y J.S
m ± 3 cr 11 = p ± 3lfpq/n
'7 46~ Pro'e that the mean value and vartance of a random var1able Y.
whtch has a hypergeometr1c dJstrlbUtion (see 4 0 34) \\ 1th parameters a,
b, n are

l\I [X] = na V [X) _ nab


a+b " ar - {a+b)<i

a~b a~-;:_:1- {a~b


2
+n(n-i>[ ) ] (7 46 1)

respect1 vely
Soluhon Let us cons1der a phystcal model of a situation u1 which
a hypergeometr1c dts tr1bu t 10n occurs, say dra,ving n balls from an
urn "'htch contatns a 'vh1te nnd b black balls, X IS the number of the
Vr h 1te selected halls \Ve can represent n se Iect tons of a hall as n trtal3
tn each of" h•ch an event A ==== {a ''h1te ball} may occur \Ve represent
the variable X ns the sum of the , ar1ables X, ,vh1cb are the Indicators

of the event A 10 the 1tb tr1a 1t X = ,h X f The random var I abies X 1


i--t
are mutually de pendent t but the a dd1 tton theorem for ex pecta t1ons !S
appltcahl e '1z

M iXl - 2} M IXtJ, l\1 [X,]= a~b ,


{=:. t
Ch. 7. Characteristics of Punetions of Random Variables 189

The variance of the sum of the random variables Xi can be found from
formula (7.0.28):

(7.46.2)

n
'V nab
LJ Var [Xd = (a+b) 2 •
i=1
Let us find Covx x 1 = M [XiX 1 ] - M [X;) M [X 1]. The product of
1
the indicators X iX 1 of the event A in the ith and the jth trial is equal
to unity when Xi = 1 and X 1 = 1, i.e. the event A occurs both on the
ith and on the jth trial, the probability being equal to a .;. b X
(a-1)
(a+b-i); the expectation is equaI to t h e same va Iue. H ence
a (a-1)
M [XiXJ] = (a+b) (a+b-1) •

Hence

(7 .46.3)

The number of terms in the sum ~ Cov xixJ is c; = n (n - 1)/2. Sub-


i<i
stituting (7 .46.3) into (7 .46.2), we get formula (7 .46.1).
7.4.7. There are five white and seven black balls in an urn. Six balls
are drawn at a time. A random variable X is the number of black balls
drawn. Find the mean value and variance of the variable X.
Answer. M [X] = 3.5 and Var [X] = 35/44 ~ 0. 795.
7.4.8. A radar installation scans a region of space where there are N
targets. During one surveillance cycle the ith target may be located with
probability pi (i = 1, ... , N) independently of the other targets.
During a period n cycles are realized. Find the mean value and variance
of the number of targets X which will be located.
N
Solution. X= .:;6Xi, where Xi is the indicator of an event Ai =
i=1
{the ith target is located}.
P (Ai) = 1-(1- Pt) 11 ,
N N
M[XJ=2.j [1-(1-pi)n]=N -}] (1-p 1)n,
i=1 i::1
II'
Var(X)= 2} [1-(1-p,) 11 ](1-pc) 11 ,
i=1
As the number of cycles n tends to 1nfi.n1ty
l1m 'f (XI= N nnd l1m Var lXJ == 0

1 e 1n the llmtt o.s n .. oo all the targets "1ll be located


7 49* \\ e take an arb1trar:y po1nt A InSide a ctrcle of radius a (F1g
1 49) and dra\v a chord BC through 1t at an arh1trary angle a to the
radtus wh1ch passes through the potnt A Find the mean length of \be-
chord
So1utton \\ e ta'he the po1nt A at random 1ns1de the e1rcle and thete-
fore tts polar coordtnates R and <ll are Independent and have dJstribU
ttons
/dr}=;,r for O<r<a, f1 (rp)=l~ for0<cp<2n
Since the mean value of the chord IS evidently 1ndependent of the an
gle <p we ta~e the p<nnt A on an arb1trary rad1us (at a d1stance R from

FJg 7 50

the centre) and dra'v the chord BC through 1t at a random angle e


to the radiUS It folio" s from the hypothes1S that the random' arlable e
has a pro bab1l1t y denstt y f 3 ({}} = 1/2~ for 0 < {t < 2n \\ e ex prec::s
the length D of the chord BC In terms of the random variables R and e
by drawl ng a per pend 1cular from the centre of the circle to the cnord
and deSignate Its length as II It IS evident that
D-=2lfa2.-H2 • H=Rstne
D= 2 l'a2 -R2stnze
The mean value of the random variable D can be found as
2"'f a

M (DJ == I df} I 2 V at-r2 smt


0 0
f} ;:
2~ dr

1t/2
= 8 ' aS (1.-cos.t {)) d{J- 1.lla. ~ 1 7Da
:rta~ 3 s n 3 it 3n
0

7 50* F1nd the mean value of the length of the chord BC (F1g 7 50)
drawn through a po1nt A 1ns1de the circle wh1eh 1s at the distance L
Ch. 7. Characteristics of Functions of Random Variables 191

from the centre of the circle of radius r, all the directions of the chord
being equiprobable.
Solution. The chord BC can be expressed in terms of the quantities L,
<D and r as follows:

BC = 2r V1 - 2
~2 sin 2 <D.

If we consider the length of the chord BC to be a random variable X,


then
n/2 n/2
mx= \
0
~ 2r V 1- ( ~ ) 2
sin2rpdrp = ~ J
0
2 2
lf1-k sin rpdrp,

where k = Llr. The integral obtained is a complete elliptic integral


E (k, n/2) with modulus k; its values can be found in handbooks. For
instance, fork = 1/2 the integral E (1/2, n/2) = 1.4675 and mx ~ 1.87r.
Since the complete elliptic integral E (k, n/2) is measured from
n/2 (for k = 0) to 1 (for k = 1 }, the mean length of the chord assumes
the values from 2r (for k = 0, i.e. for the point A at the centre of the
circle) to 4r/n (for L = r, i.e. for points A on the circle).
7.51 *. A device consists of n units, each of which may fail indepen-
dently of the others. The time of the failure-free operation of the ith
unit has an exponential distribution with parameter A. 1:
I 1 (t) = A.ie-"-it (t > 0).
A unit that fails is immediately replaced by a new one and is repaired.
The repairs of the ith unit last a random time distributed according to
the exponential law with parameter f-tt: cp 1 (t) = !L 1e-Jt 1t (t > 0). The
device operates for a time T.
Find (1) the mean value and variance of the number of units which
will be replaced, and (2) the mean value of the total time T which will
be taken by the repairs of the units that failed.
Solution. (1) We designate as Xi the number of units of the ith type
that fail during the time -r. This random variable has a Poisson dis-
tribution.

Its mean value mx.l = A. 1-r and variance Varx... l
= A,,-r.
..
We des-
Ignate as X the total number of units that failed during the time -r.
We have
n n n
X= 21
i=1
xi' mx = 2J
i=1
mxi = T 2J
i=i
At·
The variables X 1 being mutually independent, we have
n n
Varx = ~ Varx.1 = T ~ /... 1•
• 1
t= • 1
t=

(2) W~ designate as Ti the total time needed to repair all the units
o£ t-ype t that failed during the time -r. It is the sum of the times needed
(92 Appl~td Problem1 tn Probahllltfl Tlaeorv

to repa1r all the units S1nce the number of un1t.s ts X h \Ve ha' e
X
T 1 =Tiu+T,~ + + T~x,J = ~ T\l),
llz::l

where f:
lS a random vartable wh1ch has an exponential dlstrtbutton
With parameter !-'-f .and the quantities Tlu, Ti2 ), are diSJOint
Let us find the mean value of thn random var1able T l us1ng the 1n
tegral formula for the complete expectation \Ve assume that the random
vartab le Xi assumes a de fin 1te vnl ue m and then the mean \ al ue of the
vartable T 1
m m

M iTr I m} = ~ M t7'\kl1 == ~
ll~l 11.:=1
!, = ;,

\\ e multtply th1s cond1t1onal e"tpectation by the probabl11t-y Pm


that the random variable X l assumes the value m sum the products
and find the complete (absolute) ex: pee ta t1o n of the var table T ~

Ustng then the theorem for the addttton oE expectattans we obtatn

Note that the same result can be obtained by a less str1ct argument
The average number of failures of the untts of the tth type durrng the
t1me T ts At't' the average t~me taken to reparr one untt of that type JS
1/t-tt and the average t1me whtch 'vill be spent to repatr all the units
of type i that fatl dur1ng the ttme -r IS A1't'I!J., The average t1me that
1'\

will he spent to repair all the umts of all types Is 't ~ ,_,
I ill
i-1

7 52* The conditions of Problem 7 52 are changed so that each


un1t that fails 1s sent to a repa1rsbop and the devtce IS stopped for the
t1ma needed for the tepau~ When the dev1te does not operate other
Units cannot fa1l Ftnd (1) the mean value of the number of stops of the
,a
ti.ttl..V.A r.b~l!\.TJ...~ \,lJR, t.,.\.WR. --r.. (L\ ~m YJ&O.'O. ... 1lu-e. 4 +.,~ 1r4fi'l•'-~ '0~ ~ tJ-tne 't'
dur1ng \Vhlth the dev1ce w1ll be 1dle (It ts also the mean t1me spent on
repairs)
1Solution (1) 'Ve designate the number of stops during the tune 't'
as X and find tts mean value mx \Ve shall use the following not very
strtct (but true) arguments to solve the problem \Ve represent the
Ch. 7. Characteris tics of Functions of Random l'ariables 193

operating process of the device, infinite in time, as a sequence of "cycles"


(Fig. 7.52), each of whic-h consists of a period of operation of the system
(marked by a thick line) and a per~od of repair. The durat.ion of each
cycle is the sum of two random vanables: Toper (the duratwn of oper-
ation of the device) and Trep (the time taken to repair it). The average

\ 1\ I
0 v v t
tst cqcle 2nd cqcle 3 rd C.lfcle 4th ~!fcle
•'
Fig. 7.52

duration of the operation of the device mt oPer can be calculated as the


mean time between two consecutiv e failures in the flow of failures of
1l

intensity 'A = ~ A.;; it is


i=1

We seek the mean tirne of repairs mt rep . \Ve find it using the complete
expectation formula on the hypothese s Hi ={a unit of type i is being
repaired} (i = 1, 2, ... , n).
The probabilit y of each hypothesis is proportion al to the parameter
1~~ i:
11

P (H;) = t.i/2J
i=l
·;.,i = 'AJA.

The conditiona l expectatio n of the time taken by the repair on this


hypothesis is 1/~ti; hence
1l n
I. i 1 '), .
-- ~
m 1rep-........! /.u' I. '). ~
I

~l i
i=l i=l
The average time of fl. cycle
n n

i. + ~- ~ :~: . I
~- (1 + ~ ~;~ ) . i=l ' z
1=

Let us now represent the sequence of stops of the device as a sequence


of random points on the t-axis partitione d by intervals equal to m 1 •
The aYerage number of stops during the time T is equal to the aver~e~e
number of random points on the interval -c long; b

7n.1.=T/m,rep ='A-r:j ( 1+ ~ ~;:).


i=1
f94 Applf td Proble m• fn. Probt: btltty Theory

(2) Dur1 ng each cycle the devrce 'viii be 1dle (w1ll be repa1red)
"'
for t1me m,rep = ~
l'lr
~ ,..,
flJ
on the average; consequently, the average
f-1

idle t1me

7 ~53. A rando m varHt hle X Is norm ally dtstrt butcd '' tth character%~
tics m't' and a~ The rando m 'ar1a bles }~ and Z are related to A, as
Y = X and Z =
2
X 3
F1nd the co' ar1an ces Co' .ru' Co\ :r:r and Cov 1 ~:
Soluh on To stmpl 1fv the calcu lation s \Ve pass to the standar(hzed
var1a hles .r~nd u~e the fact that for a stand ard I zed norm al variab le
0

X = X- nl '( all the cent-r al moment~ o{ odd ordet s are equal to tero
Q 0

and ~I I X 2 1 - a;. r-.r (X"] = 3a~ (~ee Probl em 5 26) S1nce


., 0

Y ={X +m.x ) 2 -l\l [X2J= X 1 + 2Xm x+ m~- Varx -n1i


Q 0

0 Q

= X2 +2Xm:.:-o~,
it follow s that
0 0 0 Q 0

Cov:cu = }.l {XY} = '1 [X (X?..+ 2Xmx -o~}] = 2aimx


Furth ermo re
c 0

xa + 3X2mx + 3Xm~ + m~- (3mxo~ + M~)


0 0 0

z =(X + mx) 3
- 'J [XS] =
0
0

= xa + 3XZm~+ 3Xm i- 3m :co~


and there fore
0 C" 0 ~ 0

Covz: = !J! [XZJ = !J,J (X'J + 3m\.I\l [X3J + 3mi 1\1 [X 2 J


-3m ~oit\I [X J ~ 3a~ + 3nt~o~
0

1 and finall y
c

+ 2Xm x- o~) (X3 + 3 X2 nt~ + 3Xm~- 3n!x<l~;)J


0 0 C) 0

Cov11 = lvl
t {(X 2

= m~ lL\. 'l + m~ {m~- oi) i\1 [X J+ 3nz.ra~ = 12m.-c:o! +Gmxa.\


5 M ~ 6 () 2 31

7.,54 An obJec t '"hos e mass IS a g 1s ,,eJgh ed four tJmes us1ng an


analy tical balan cet the resul ts obtn1 ned are X 1 , X 2 , ~Y 3 , ~Y ~ Thetr
ar1th meltc mean ts tal\:en as the meas ured value of the mass Y =
(Xt +X ,+X3 +
X 4J/4 The resul ts of tho "e1gh1ng.s are mutu ally
Indep enden t The balan ce gtves a syste matiC error mx = +O 001 g
Ch. 7. Characteris tirs of Functions of Random Variables 19S

The mean square deviation of each weighing ax = 0.002 g. Find the


mean value and the mean square deviation of the random variable Y.
•'
Answer: m.y = a + 0.001 g; cry = ax/2 = 0.001 g.
7.55. Four independe nt measurem ents are made of the same vari-
able X. Each measurem ent is characteri zed by the same expectatio n
mx and the same mean square deviation cr x· The results of the measure-
ments are XI, x2, Xs, x4. 'Ve consider the differences between the
consecutive measurem ents: yl = x2- XI, y2 = Xs- x2, Ys =
X 4 - X 3 • Find the mean values my., my. and my.. the mean square
deviations cry,, cr Y•• (J Y• and the standardiz ed correlation matrix II rii II~
Solution •

, --
m"Y• - m Y2 -- 11'"Ya -- 0' a 2Yt -- cr Y•
2
-
- a 2
Y• -
- 2a 2x,
Ciy 1 = Gy 2 = cry. = Ci:c y2.
0 0 0 0

Since the variables X 1 , X 2 , X 3 , X 4 are mutually independe nt,


• 0 0 0 0 0

Covy 1y2 = l\1 [(X 2 - Xt) (X3 - X 2 )] ::= -1\I [ X;J = - cr~,
0 0 0 0 0

Covy 2 y3 = l\l [(X 3 - X 2 ) (X 4 - X 3 )] = - MJXiJ = -a~,


0 0 0 0

Covy 1 y3 = M [(X 2 - X 1) (X 4 - X3)J =0,

1 -1/2 0
llrull= 1 -1/2 .
1
7.56. A hole may occur with equal probabilit y in one ofthe six walls
and at an-:,r point on each wall of a cubical tank full of fuel. All the
fuel above the hole leaks out of the tank. When sound, the tank is kept
3/4 full. Find the average amount of fuel which will remain in the
tank after a hole appears .
. Solution. We shall assume for simplicity that an edge of the tank
JS equal to unity. We designate the height of the wall as X and the
amount of the remaining fuel as Y. Since the base area is equal to unity
nhave •
for X< 0.75,
for 0.75 <X< 1.
If the hole occurs higher than 0.75 of the ·way from the-bottom of the
tank (X > 0.75), then no fuel will leak out and the sa~:lle amount of
fuel, !"" = 0.75, will remain in the tank. The probabilit y of the latter
cabse IS equal to the fraction of the surface area of the tank which is
a O\'e the 0. 75 level:

P{Y = 0.75} = P{X > 0.75} = 1/6 + (4/6) X 0.25 = 113.


13•
iOG Apptu~d Problems tn ProbabllllY Th~Dr~

If n hole appear" tho botton1 of the tan~ (X = 0) 1 then aU the


10
fuel "Ill run out, Rnd tl1e probalnht' of thtc:; lS equal to the fraction of
the bottom nre.a o[ the lank
P{l' = 0} == P{X = 0) = 1/6
If a hole appears 1n one of the l\ ails of the tanl. . X < 0 75 d1sta.nt
from the bottorn then the amount Y = X of fuel \\Ill remain 10 the
tanl"" The probability dcns1ty tn the Interval (0. 0 75) IS con'}tant and
eq ua I to ( 1 - 1/3 - 1/6)/0 75 = 2/3 'fhe a vcrage nmoun t of fuel
remaining tn the tanl
0 75

m,=O 75x++O x-i-+ J x faz=O 44


0

7.57. A po1nt a 1s fi "\ed 111 the 1nter\ al (0 t) (Fig 7 57) A random


potnt X 1s untforml~ distributed In that Inter\ al Ftnd the coeffic1ent

~~1
X [
"V 't: v : ... :)> a 1
....
t
R
tlg 7 57

of correlat1on bet\\ een the rando1n 'artable X and the dtstance R from
the point a to X (the dtstanco R 1s al" ays assumed to be pos1ll\ e)
Find the 'alue of a for '' h1ch the \ aru1.bles X and R are uncorrelated
Solution '\'~ find Cov x~ from tho formula Cov ~r = 1\1 [AR]- m~m,.
1 I
l\1 rX R) = M rX I a -X II = J X I a - X I t (x) dx = j X I a- X I dx
0 0
a t
= ~ x(a-x) dx- ~
0 n
x (a-x} dx = a;- i t- { rrt,.,=f,
l a t

m,.= J )a-x)dx= i (a-x)dx- J(a-x)dx=a 11


-a+-j-
o 0 0
Hence

Var~= 1~, Ux=- ~·-, Var,.=a 2 [RJ-m;,


2.\ 3
I

tt:d llJ = J(a -x} dx= a 2 2


- a+ f,
0

Varl'=2a3-a•-a2 + 1~, a,==VVilr,.,


Ch. 7. Chara cterist ics of Funct ions of Rando m l'ariabl es 197

Hence
a3
r xr = ( 3 -
a
2
2
+ 1 )
12
j (-.V/ 2a -a -a + 121
3 4. 2

The equat ion a 3 /3 - a2 /2 +


'1/"12 = 0 has only one root in the
interv al (0, 1), viz. a = 1/2. There fore, the rando m varia bles X and
R will be uncor relate d only for a = '1/2.
7.58. A car can trave l along a highw ay with an arbit rary speed v
(0 :s:;; v ~ VmaJ . The highe r the speed of the car, the highe r the prob-
abilit y that it will be stopp ed by an inspe ctor. Each time it is detai ned
for an avera ge time t 0 • Inspe ctors are statio ned at rando m along the route .
The rando m numb er of traffi c cops per unit lengt h along the route has
a Poisson distri butio n with param eter 'A. There is a linea r relati onshi p
between the proba bility of being stopp ed and the speed of the car:
p (v) = kv (0 ~ v ~ Vmax) , where k = 1/vma x- Find a ration al speed
for the car vr at which it will cover the whole route s in a minimu~ time
on the avera ge. ·
Solut ion. The avera ge time of cover ing the .dista nce s· is '

t = s/v + 'Asp (v) t 0 = slv + /..skvt 0• •

If the minim um of this funct ion lies, in the inter val (0, Vmax) , th(m'
it can be found from the equat ion
at s
iJv ==- v!! + A.skt 0 = 0,
whence

This formu la is valid for vr < Vmax•i.e. for vmax > 1/('A.t0).
Thus, for instan ce, for Vmax = 100 km/h , 'A= 1/20 km- 1 and t 0 =
20 min

vr = j/ woj (2~ x-}) ~ 77.5 km/h.


If Vma:-. < 1/(l,t 0 ), then the minim um of the funct ion t = s!v ..L
l:sp (~) f 0 lies outsi de of the interv al (0, Vmax) , and the speed vr ~
l·ma,. ~s the most a~vantageous. For exam ple, if for the data given above
t.he lime the car Is delay ed by a cop decre ases to 10 min, then vr =
lma;.. = 100 km/h .
7.59. A numb er of indep enden t trials are made in each of which an
event A may occur with proba bility p. The trials conti nue until the
nent ~1 _occurs·k times , and then they are termi nated . A rando m vari-
~ble X ls ~he numb er of trials which must be made . Find its mean
'nluc . \'anan ce and mean squar e devia tion.
Solut ion. \Ye repre sent the rando m Yaria ble X as a sum
k
1
.t\. -2 T'
~ -LX
"-X j • •• T ·Xh = '\1
L..J X it
i=i
t.98 A. pplt.td Prob ltms (n ProbabfUty Th-eory

where X 1 ts the number of tr1als made t1ll the first occurrence of the
event A. X 2 IS the number of tr1als made from the first to the second
occurrence of the e' ent A, ., X k t.s the number of trtals from the
{k- 1)th to the kth occurrence of the event A
Each random var1ablo X 1 has a geometriC d1strtbutton beginning
"1th un1ty [see (4 0 32)1., and 1ts charactertSttcs
-a[ [1 11 = 1/p. Var [X 11 = qlp 2 (q = 1 - p)
Us1ng the theorems fo-r the add1t1on of e"'tpectat1ons and var1ances,
we obtain
A ~

1\l IX}= }J !\II X1} = kp,


t-1
'"ar IX}==~
i-1
'"ar tX 1] = l..ql p~,

az= l/\'ar [XJ = lfkq/p.


7 60 To assemble a reliable dev1cet 1., h1gh qualtty homogeneous
parts are needed Each part 1s subJected to var1ous tests and (1ndepend
ently of the other parts) may be found to be of h1gh qual1ty \\ tth prob
abtlit} p \Vhen ~ htgh qualtty parts are selected, the tests are ter
m1nated The supply of parts ts pract1calll unlimited Ftnd the mean
value mx. and vartance Var:r: of the random vartable X (the number of
parts that have been tee;; ted) F1nd the max-
Imum practically posstble number of parts
XI x2 whtch \Vtll need to be tested~ Xmlx
'1)JV) Answer From the solut1on of the pre
ced1ng problem we have
X,>X2
mx = kp, Var.:t ==:==:: kql p 1 , cr~ = Vkqfp and
Xmax == kp + 31/kq/p.
1 61 According to a networl of manage-
ment planntng the moment Y an operation
begtns IS the ma"ttmum t1m.e two support
Fig 7 61
1ng opera t 1ons X 1 and X 2 are f1n1shed Th~
random vartables X 1 and X 1 are mutually
tndependen t and ha' e prob ab 1l1t y dens1 t1es f 1 (x1) and f 2 (x 2 ) Ftnd
the mean value and var1ance of the :random var1ahle Y, as \vell as the
range of 1t.s pract 1call} pos.s 1bl e "al ues
Solnllon

for X1 < X2 -
The dnma1n I, '' h~re Y 1 > "\. 2 , and the domrnn II, "\\here X 1 < R"sl
are sho" n tn FJ~0' 7 61
QC., :C:l
m11 = M l1' 1= } xdtlxt) { j } (x 2 2) dx2 } d.t:t
Ch. 7. Characteristics of Functions of Random Variable.~ 199
'

co X2

+I X2/2 (xz) { ) /i (xi) dxi} dx2


-co -co
co 00

= .\ xddxi) F 2 (xi) dxi + ~ xd2 (x 2) Ft(x2 ) dx2 ,


-co -co

where F 1 (x) and F 2 (x) are the distribution functions of the random
variables xl and x2
respectively.
00 co

M [Y 2 ) = \ xifi (xi) F 2 (xi) dxi + Jf x;f 2 (x 2) Fi (x 2) dx 2 ,


·'
-oo -oo

Var [Y] = M [Y 2 ] - m~, O"x = VVar [Y].

The range of the practically possible values of Y is my ± 3cry.


7.62. Under the conditions of the preceding problem, the n support-
ing operations must be finished before the planned operation is begun
(moment Y). The moments X 1 , X 2 , • • • , Xn when they are finished are
mutually independent and have probability densities / 1 (x1 ), f 2 (x 2 ),
· .. , In (xn). Answer the same questions.
Solution.
Y=max{X 1 ,X 2 , ••• ,Xn}=Xi for Xi>Xi (i=1,2, ... , n,i=l=j),
00 Xt Xl

my= M [Y] = J xd 1 (x 1) { ) (n --1) J / (x


2 2) / 3 (x 3 )
-co -oc -oo

oo X2 X2

+ .\ -co
x2l2(x2) {) (n-1)
-co
~
-co
ft(xi)la(xa)

... In (xn) dx 1dx3 ••• dxn} dx 2



co xn xn

+ .. · + 1Xnfn (xn) { ~ (n-1) J fdxt)/2 (x2)


-oo -oo
• .. fn-dXn-i) dx 1 dx 2
""
•.. dxn-i} dxn.

In this case the (n - 1)-tuple integral decomposes into the product


of n - 1 simple integrals, and, therefore,

i=l -CX>
200 A r Dlt~d Pr ob le m ' in Pr ob ab rl ity Th eo ry

"' he re r l (x .) IS th e dt st rt bu tt on fu nc
ti on of th e ra nd om 'a rl ab le XI
(J = 1, 2,. , n) of the ar gu rn cn t z~ By analogy " e find that
;\l{Y'-J = _L, ) .r V ,( x1 ) fi E1 (x 1) dx ,; Var ll'"l = \l {Y2 ] -m;
1= ! - t)O l ~J~n
1~J

If th e ra nd om va ru 1b le s X (J = 1, 2,.
1 ., n) ha \C the snme d1strtb-
ut 1o n 'v 1t h den~Ht} f (x) an d dt st rl bu
tt on fu nc ti on F (x ), th en

m 11 - n ~ .r f (x ){ f (xw-• dx .
lXI

~~ [ Y 2 J an d \ ar f} ] ca n be ca lc ul at ed b} an al og y
oc

;\ I( Y 2 ]= n J.r"/(x){F(.r)}- d.r
-o o
1
an d V ar {Y J= [Y 2 J- m ;

7 ~63~ A de \ 1ce (d es tg ne d as a ma>..tmum


la rg er of t'" o vo lt ag es V an d lr T 'o l tm et er ) reg1ste-rs the
1 ho ra nd om 'a rJ ab le s l' an d V,
ar e m ut ua ll y In de pe nd en t an d ha2ve
th e sa m e den~nty j (v) 1F1nd the
m ea n \a lu c of th e vo lt m et er read1ng
1 e of th e ra nd om va rt ab le
m ax {V1 , V :} , 1f f (v) = J..e-tv (v > 0) IS l' =
"1 th pa ra m et er A an ex ponenl1=:1l dJstribUtton
So lu ti on . In ac co rd a ne e '' 1t h th e so Iu
tt on of th e pr ec ed tn g pr ob le m
oc

J vf(v)F{v)dv=2 JAve-A"(1-e-:a.")dv
QO

l\ I[ V J = 2
-o o 0
00

J'J...ve-J.." d v - 2J.. ~ ve-


co

- 2 2'J.." c!v
0 0
2 t 3
=y- 2A = 2). •
It ca n he se en th at nl IV] 1s 1 5 ti m es as la rg e as th e m ea n va lu
et th er l l or v2 es of
1. 64 The mean t-alue an d the varlance of
the sum of a random number
y
of random terms A ra nd om 'a rt ab le Z IS
a su m Z = ~ X h "h er e the
ra nd om 'a r1 ab le s X 1 ar e In de pe nd en t i- J
an d ha vo tl1c sa m e dt st rt bu tl on
\\ tt h m ea n va lu e mx an d vo.r1ance V nr
:u th e nu m be r of te rm s Y I~ an
In te gr al ra nd om va rt ab le '\h rc h do es no t
de pe nd on th e te rm s X n has
a me1n va lu e my an d a \a r1 an ce V ar y F tn
of th e ra nd om va ri ab le Z d th e m ea n 'a lu e an d 'a rt an ce
Ch. 7. Characteristics of Functions of Random Variables 201

Solution. Assume that the discrete random _variable Y has an ordered



senes
1j2j ... ,k, ...
Y:
PI I P2 I ... I Pk I ... .
We make a hypothesis {Y = k}. On this hypothesis
h
l\1 [Z I Y = k] = Lj M [Xd = kmx.
i=1
By the complete expectation formula
M [Z] = ~ kphmx = mx ~ kpk = mxM [Y] = m ...:my.
k k
Similarly, the conditional second moment about the origin of the-
variable Z

M[Z21Y=lcJ=l\1[(~ 1 2] [
xi) =nl ~ J
1 X1+2f]jxixj
k k

k
= i=i
lj a 2 (XI+ 2lj mxmx ;=: ka 2 [X]+ k (k-1) m~
i<j
= k [Varx+m~] +k (k-1) m~= kVarx+k 2 m~.
By the complete expectation formula
a 2 [Z[ = Varx 2J kph
k
+ m~ ~ kk
2
pk = Var.>:my + mia 2 [Y]

= Varxmy +m~ (Vary+ m~) = Varxmv+m~Vary +m~m~,


Var [Z] =a2 [Z]-m~m~ = Varxmu+ m~Vary.
Thus we have
lnz = Varz = Varx my
mxmy, m~ Vary. +
If the random \"ariable Y has a Poisson distribution with parameter a,.
then
lnz = amx,
Varz = a (Varx mi). +
7.65. A message is sent over a communication channel in a binary
code consisting of n symbols 0 or '1. Each is equally probable and in-
dependent. Find the mean value and variance of the number X of changes
in symbols in the message as well as the maximum practically possible-
number of changes.
Solution. Let us consider the n - 1 changes from one symbol to the
next to be n - 1 independent trials and consider, for each of them,
a variable X i• which is the indicator of a clwnge of symbol. This value
is equal to unity if a symbol is changed and to zero if there is no
change.
n-1 n-1 n-1
J\. ~ X i .. i\I[XJ =
"'' = --J 2J i\1 [X;]= 2J p= (n-1) p,
i=1 i=t i=1
202 Appll'd Problem~ tn Probability Theory

"bere p IS the probabtltty that a symbol 1s changed at a g1ven (lth)


Inter\ al Stmtlarly
n-1
VariX)-~ Var(X 1)==(n-1)p(1.-p)
i~l

1t 1s evident 1n thrs case that p = 1/2 and ?\II X) = (n- t)J2


Var [XJ = (n - 1)/4 By the three sigma rule Xma:.: = (n- i)/2 +
,r
3 n-- 1/2
7 66 A group of four radar un1ts scans a reg1on of space In wh1ch
there are three t arget.s S 1 , S 2 S 3 The region ts scanned for a t1me t
During that t1me each untt 1ndependentl~ of tbe others. rtlay detect
each target 'v1th probab1ltt, p 1 "h1ch depends on the ord1nal number
of the target and transm1ts 1ts cooi'd,nates to the central control stat1on
Ftnd the mean 'alue of the number of targets .A "bose coordinates
w1ll be registered at the stat1on
Solut1on \Ve des 1gna te as Xi the Indtca tor of the e\ en t A 1 ~
{the ltb target 1s detected}
1f the ~:th target ts detected
1f the tth target 1s not detected (t = 1, 2, 3)

The mean value of the random variable A, 1S 1\I [X,]= P {the ith
target ts detected} -= 1- ( 1- p 1)"'
a
Stnce X= 2J X 1 t 1 t folio\\ s that
t.~1

3 3
~~[XI-= lJ [1- (1- Pt)'l = 3-}J (1- Pi)~t
t-1 t-1

7 67 The cond1t1ons of the prec.ed1ng problem are changed so that


the proba hill t1es of detect 1ng a target by dt fferen t un1 ts d1Het the ]th
untt detects the ~th target "1th probabllit~ PtJ (t = 1., 2, 3 1 =
1, 2 3 4)
, 4
Ans\\er l\I [ \] = 3- 2J lf
1=1 J--1
(1- Ptj)
7 68 A radar tnstallat1on scans a region of space 1n \VhLch there are
four targets Dependtng on the ttme. of scann1ng the probabtltty of
det ec t1 ng a target 1s a function p (t) and does not depend on the deteet1on
of the other targets Ftnd (1) the mean value of the t1 me T 1 In \vbJch
at teast on~ target'\ 1\\ be detected and {2) t'he mean value of \he trme T•
1n \\ h1ch all the four targets \Vtll be detected
Sol u lion (1) The prob abtli ty that not a sIngle target \V Ill be detected
1n t 1me t JS [1 - p (t)J4. the probnblll t y that at least one target 'v1ll be
detected 1n that trme lS 1 - (1 - p (t)]" Thts 1s s1mply a dlstrtbutlon
funct1on F 1 (t) of a randum var1able T 1 As ,ve pro'\ed 1n Problem 5 28,
Ch. 7. Characteristics of Functions of Random Variables 203

for a nonnegative random variable T1


00 00

M[T1 ]=) [1-Ft(t)]dt= J['1-p(t)]!'dt.


0 0

(2} The probability that all the four targets will be detected in time t
is [p (t)l4. This is a distribution function F 4 (t) of the random variable
T4 • Its mean value is

1[1-Fdt)]dt= ~ {1-[p(t)] }dt.


00 00

4
M[Td=
0 0
For example, if the probability p (t) is defined by the formula p (t) =
1 - e-at, then
oc 00
,. 1 \'
J\1[T 1]= J [1-(1-e-al)]4dt-= j e-4at dt= rx,
4
0 0

:;rx.
00

M[Td= ~ [1-(1-e-cd) 4 ] dt=


0

In this example the mean time of detecting all the four targets is
€ight times as long as the mean time of detecting at least one target.
7.69. \Ve make n trials in each of which an event A may or may not
<Jccur. A random variable X is defined as the number of occurrences of
the event A in the whole series of trials. Find the mean value of the
variable X.
Solution. We represent the variable X as the sum of nrandom variable
Xi (i = 1, ... , n), where Xi is the indicator of the event A in the
ith trial:
1 if the event A occurs on the ith trial,
0 if the event A does not occur on the ith trial.
n n
X= 2J Xi; ivi [X]= 2J 1\-1 [Xd,
i=1 i=1

111 !X d = Pil where Pi is the probability of occurrence of the event


Qll the ith trial. Thus we have
n
11 [X]= 2J Pi· (7.69.1)
1=1

ln particular, if p 1 = p 2 = ... = p, then


1\1 IXI = np. (7.69.2)
It sl~ould be emphasized that for formulas (7.69.1) and (7.69.2) to be
~pphcable, the trials need not necessarily be independent.
204 Ar;pl1ed P o~lemJ fn Prohab,ldy Theory •

7. 70 . Cons1dertng tl1c trtnls to bo Independent, under the c?ndttwn


of Problem 7 6Ut f1nd the var1 1ncc of the random varHlh1e X
Solution. By the theoren1 for the 1.ddtt1on of 'ar1ances.
n. 1\.

Var I .A J= \'ar (Lj X 1J = ij Var IX!l·


l~l l~t

Tbe vartanco of the 1nd Jca tor of the C\ cnt .rl tn the ~th trtnl {se~ formulJ
(4 0 lU)] IS Pr (1 -- pl)~ then
n
\ ar [XJ :==- 2J
1=-1
p~ (1- PJ)· (7 70 i)

In th~ p1rt1cular ca~e '\hen p 1 P2 ~ ;:::;;:- = Pn = Pt


Var lLYI - np (1 --- p) (7 70 2}
It ~hould be emphasized that formulas (7 70 1) and (7 70 2) can be-
used on] y for Independent tr1nls
7~ 71 \V·e make n 1ndependeut trials 1n each of \Vh1ch an e\ ent A
rna~ occur or not occur A ran do nt 'arta ble X ts de f1 ned .as the number
of occurrences of the e' ent /l F1nd 1ts variance
n.
Soluhon x~ L xl
l=t
\\here xi 15 the tndJcator of the c"ent A lD

the trtal By the general form n1il (7 0 28) for the var1ance of
1 th
random 'ar1ahles~ \\t) have
,, n

Va-rIX} ;;;:::: ~ \r ar IX t] .+. 2 ~ Covx1x1,


l=l 'Z<}

where Cov~ l x I lS the covar11nce of tho random 'ar1ables X 1 and x,


Ca\x 1x -== 'liXtXJ} ~ \1 [X 1] :\liXJ]
1
The vnr1able X ,X 1 tlJrns 1nto un1ty only 1i X 1 == 1 and X{ = 1 (1 e the
event A occurred both on the lth and on the J th trial) l\1 X 1 XJI = P~i'
\\here p, 1 IS the prohnlnli ty that tho e1 ent A occurred both on tbe
lth and on the Jth tr1al Co, ~·\.x = Ptt - p 1p 1, hence
1
n

Var {X}== LJ Pi (1-- P~) ~ 2 LJ (Pt)- PtPJ) (7 71)


1~1 1<J

Th·us 1 n order to fJ n d the \ ar ta nee of tho n u m her of occurrences of


an e\ en t In n dependent trials, 1t IS not sufftc1ent to kno'\\ the prohab1hty
P~ of the occurrence of the e' ent In each trtal. It 1s also necessar) to
J. . no\v the prob ab1lt t y p u of the s 1m u1 ta neo us occurre nee of the e\ ent
In each pa1r of trials ·
In parttcu]ar, \\hen p 1 === p 2 =- = Pn = p and "hen Ptl does
not depend on t or J and IS equal to P, formula (7 71) assumes the
form
Var IXJ = np (1 - p) + n (n - 1) (P - p 2),
Ch. 7. Charact eristics of Functio ns of Random l'ariable s 205

where p = Pii is the probab ility of the simulta neous occurre nce of the
€Vent in any pair of trials.
7. 72. There are a white and b black balls in an urn. ,;v e draw k balls
at random . A random variabl e X is the numbe r of white balls among
the drawn balls. Withou t resorti ng to t.he distrib ution of the random
variable X (hyperg eometr ic, see Chapte r 4), find the mean value and
variance of the random variabl e X.
k
Solutio n. X=~ Xi, where
i=1

xi= {6 if the ith drawn hall is white,


if it is black.
k
M [X]=~ M [Xd = kp,
i=1

where p is the probab ility of drawin g a white ball;


p = al(a + b), M {X] = ka!(a + b).
We seek the varianc e of the random variabl e X as the varianc e of
the number of occurre nces of the event in k depend ent trials [see
Problem 7.71]:
Var [X] = kp ( 1 - p) +k (k- 1 ) (P - p 2). (7.72)
Let us find P, which is the probab ility that some two drawn
balls (the ith and the jth) are white: P = a~b a~-;;~ 1 . Substi tuting
this express ion and p = a b into (7. 72), we obtain
a+
, kab ( a a-b a2 )
Var[X ]= (a-\-b)2 +k(k - 1) a+b a+b- 1- (a+W ·

7.73*. A train consist ing of m carriag es arrives at a railway shuntin g


yard. Out of these carriag es m 1 are bound for A 1 , m 2 are bound
h

for A 2 , ••• , m71 are bound for A 11 ( ~ m; = m). The carriag es are located
i=1
at random along the train and irrespe ctive of their destina tions. If
two adjacen t carriag es have the same destina tion, they should not be
U~Jcoupled; if their destina tions are differen t, then they are uncoup led.
Fmd the mean value and varianc e of the numbe r of uncoup lings that
must be made, and also estima te (using the three-s igma rule) the range
of the practic ally possibl e numbe r of uncoup lings.
Solutio n. Let us conside r the m - 1 couplin gs of the carriag es to be
m - 1 possibl e positio ns at each of which the carriag es may be un-
coupled or not. We designa te as X the numbe r of the uncoup linas and
the numher of "nonun couplin gs'' as Y; Y = m- 1 - X. It is "easier
206 A.pplltd I roh't m1 In Probabzllty Theo ry

to man ipul ate the rand om vart able Y. 'Ve repr esen t Jt as the sum Y =
m-1
~ Yh '\vhere
i~1

1f ther e rs no unco upli ng at the ith pos1tton,


r, = {6 If ther e IS an uncou pltn g

(the Jndt cato r of unco upli ng at the ~th posi tion ).


m-1
By the theo rem for the addI tion of e"(pectat 10n5, 1\I l Y] = LJ l1 rr~d'
~=1

1\[ I Y, 1= q1 '\her e q, JS the p1 obab tli l) that thor e 1s no uncoupling at


the zth post t1 on All t bo pos 1ttons are unde r Jdcn ttcal eondlltons q1 =
q2 == = qm_ 1 = q Tbc proh abtll t y q that ther e J9 no unco u pl1ng at the
tth posi tion 1s equa l to the prob abil ity that t\vO carr iage s correspond..
1ng to the tth post tlon have the sam e dest inat ion Adding up the
prob abJl l tte.s that thes e carr iage s are boun d for the 1st, the 2nd. • ,.
the 'h th po 1n t of dest 1na t ton, '\ e find
k A
~ "V m1 (m,- -O _ t ~ {m -i)
q- LJ m (m-1 ) - rn (1"1- l) LJ m., 1 '
1~1 i~1

k h

J\1 I },.J = m - t "' 1


( 1
m (rn- t) L.J m, ml- ) = ! ~ mt(r n1 -1) .
1........-1
1 1

Hen ce
A

i\1[ YJ =m -1- * ~ mt(m 1 -1)

S1nce the varu thles X and Y d1ffor by a cons tant . Var lXJ === Var [Y].
The vnrHlnce of the rand om "ar1 able y can he foun d f.rom formula
(7 0 28) for tl1e vo.rtance of the sum
m-l
Var [Yl= )j Var{J,J+2~ Cov11 ,v 1
~~! t<J

The eo" ar1ance of the rand om vana bles Y 1 Jtnd Y 1 can be found
from the form ula
Cov.,i 11 =~I rY 1YJ] -~[ l}..,J l\f [Y1 1= M {Y,Y1J-q s
The prod uct Y1Y{ IS zero 1f ther e rs an unco upli ng at leas t at one post-
tlon , and Is equa to unit y tf ther e Is no unco upl1 ng at etth er position
Hence
Ch. 7. Characteris tics of Functions of Random Variables 207

We seek the probabilit y that there is no uncouplin g at either position


(i or j). This probabilit y is different when the positions are adjacent
(j- i = 1) and not adjacent (j - i > 1). \Ve designate the probabilit y
as qad when the positions are adjacent and as qn.ad when they are not
adjacent. The probabilit y that there is no uncouplin g at two adjacent
positions is equal to the probabilit y that the ith, the (i + 1)th and
the (i + 2)th carriages have the same destinatio n
k k
'\:l mi (mi-1) (mi-2) 1 "' ( 1) ( 2)
qad = Li m (m-1) (m-2) = m (m-1) (m-2) Li mi mi- mi- ·
i=1 i=1
For two adjacent positions l\{[YiY 1·]=qad• Covy.y.= qad-q 2 • The
l J
probability that there are uncouplin gs at two non-adjace nt positions
(j- i > 1) is equal to the probabilit y that each pair of carriages,
between which this position is, has the same destinatio n. This may
happen either if all the four carriages are bound for the same place
(the Zth), or if the first pair is bound for the Zth place and the •second
pair for the rth place. The probabilit y of the first variant is
1 2
ml(mz- ) (mz- ) (mz- 3) and that of the second variant is
m (m-1) (m-2) (m-3)
mz(mz-1)m r(mr-1) Tl1e to t a 1 pro ba b"l"t h h . 1
m(m- 1) (m- 2) (m- 3) . I I y t at t ere IS no uncoup-
ing at two non-adjace nt positions is
k

qn.ad= m(m-i)( !- 2)(m- 3) ~ [ m,(mz-1 ) (m1 -2)(m1 -3)


1=1

+ ~ mz(m1-1)mr(m r-1)].
The covariance Covyiy i for two non-adjac ent positions is qn.ad - q 2 •
With due regard for the number of adjacent (m - 2) and non-adjac ent
(C~- 1 - m +
2) pairs of positions, we obtain
+
Var [X]= Var [Y] = (m-1) q (1- q) 2 (m-2) (gad- q2)
+ [(m- 2) (m- 3)- 2 (m-2)] (qn.ad -q 2), crx =Uy = lfVar [Y].
The range of the practically possible values of X is M [X] ± 3crx.
Remark. The calculations only have sense for m > 3. If in the formulas for-
qnd and qn.nd some of the terms are negative, then the correspondi ng terms are-
assumed to be zero.
"7.74."}._, A number of trials are made, in each of which an event A
(success·) may occur or not occur. The probabilit y that the event A
occur~ at least once in the first m trials is defined by a nondecreas ing-
fu.nctiOn R (m) (Fig. 7.74)*>. Find the average number of trials which
Wtll be made before a success is achieved.

*) The function R (m) is defined only for an integer m hut to make the figure-
1
c earer, the points in Fig. 7.74 haYe been connected hy'lines,
'208
....
,.1 , ptlt." t I
lEU
,.o bltli rru In Pro bab ilttv Th~orp

Solution. Let us a!Jsume that the trtals are not termtnated alter a
succeqs 1.s ach1c\ cd E~ch of them (e'tcept for the first) can bo "necessary,.
tf a succes~ 1s not Jet ach1e" ed, and "supcrOuousu if a success is achieved
Let us connect a random variable X 1 ~ tth each (itb) tr1al and assum(t
1 t to be un 1t ~ 1f the trial IS necessar} and zero tf 1t IS su perfl uou.s
\\'e cons1der the ran(lom vartab\e Z, the number of tr1a1s whtch mm\
be made before a success IS achieved It 1s equal to the ~urn of all the
random 'arutbles X h the frcst of \\ htch lS al\Va"}S equal to untty x1
(the first tr1al 1s al,vays necessary}
R(m)
1 ---
_____ _....-~-

Z=X 1 +2} X 1
OCI

i~2

The ordered ser1es of the random


1 ar1ablc A. 1 (t > 1) has the form
t f
I
j
I I I
l
0 1 1

1 2 J
' m R(t--i) I t-R(t~t)
(rf a ~ueces~ 1s ach1eved 1n the pre ..
crd1ng l - 1 trtals, the 1th tru1l IS
superfl uoust If tt IS not, then the
trtal Is necessary) The mean 'al ue of the random variable X 1 lS
l\trX~]=O R(t-1).f 1 11-R (t-f)J=1--R(t-1)
e can ovJdently use the ~same
\\
7
notatton for l\1 (X1 ) ~~I [1 I-
1- R (0) (R (0) ~ 0) Hence
ctO 00

\IIZJ=~ [1--R(l-1)J==: ~ [1-R(k)J


'i~1 ~=0

7. 75* For the cond1t1ons of the preced1ng problem find the varH\nee
<Jf the nuntber of tr~.als needed to achte' e a success
Solut1on The random v ar1.ab1es X 1 X 2 t , X h are tn u tually
dependent, and "e cannot~ therefore, simply add thetr vartances to
gether 'Ve must find the second moment about the ortg1n of the
'ar1abl~ Z
00 oc

'J (Z J== i\J 1(2: x~) 1 = ni A~l + 21\t [~ x.xJL


2
2

1=1
CL
'= 1 j-;?1

'flX1,~1~ f1-R(t ~1)}~1-R(l-1)


The vartable X ~X 1 1s un1ty 1f .A i and X 1 are both unit)~ 1 e both
tr1a1s {the tth and the ]th) ttre necessary F~r the t" o trtal~ (the zth
and the ]th) to bo necc(O,sary ~ a later (Jth) trtal n-._ust he neees~ar'
'h\ r,xiXJ) == ?\X 1 == \, X 1 = 1) = \ - fi \J- ~1)
Hence
00

l\f[Z1 J=2J ~~1


fi-R(l-1)1+22J (1-R(J-1)1
l>(
Ch. 7. Characteristics of Functions of Random Variables •209

The last sum can be calculated as follows: we first fix j and take a sum
over i from 1 to j - 1; then we take a sum over j from 2 to oo. Under
the summation sign over i, for a fixed j, all the terms are equal and
do not depend on i; their number is equal to j - 1. Therefore,
co j-1
~ [ 1- R (j -1)] = 2j 2j [1- R (j- -1)]
i<i j=2 i=1

co
=lj (j-1)[1-R(j-1) ]
i=2
00 ""

= ~ k[1--R(k)]= 2j k[1-R(k)].
k=1 k=O

Hence
co co co
Var(Z]= 2j [1-R(k)]+2 2j k[1-R(k)]-{lj [1-R(k)]}Z.
k=O k=O k=O

7.76. The probability that a target will be detected by a radar in-


stallation increases exponentially with the number of surveillance
cycles n:

P (n) = 1 - an (0 < a < 1). (7. 76)


Find the mean value of the number of cycles X after which the target
will be detected.
Solution. Setting a = 1 - p, we rewrite formula (7. 76) as follows:
P (n) = 1 - (1 - p)n.

We can see that the cycles are mutually independent and the prob-
ability that the target will be detected in each cycle is p = 1 - a.
Under these conditions the variable X has a geometric distribution
beginning with unity, and its mean value
l\I [X] = 1/p = 1/(1 -a).
We can obtain the same result using the solution of Problem 7.74:
co 00

M [X]=~ [1-P (k)] = ~ ak = i..:.a .


k=O k=O

7. 77. A radar installation scans a region of space in which there are n


t?rgets. During each surveillance cycle the radar may detect a target
(mdependently of the other targets and other cycles) with probability p.
How many cycles will be necessary (1) for the probability of detecting
all the targets to become no less than P, and (2) for the mean number
of detected targets to become no less than a given number m < n?
~~-0575
\
210 AppU~d Probl~m• ln. ProbabUUv Theory

Solution. (1) 'Ve designate the probabtltty of detect1ng all n target~


1n k cycles as Gn (k), Gn (k) = lG (k)]'', where G (k) IS the probab1hty
of detecting one target at least once 1n k cycles G (k) = f - (1 - p)k,
whence Gn (k) = [1 .- (1 - p)kJn 'Ve set Gn (k) ~ Pt
{1 - {1 - p)" ]n ~ P, (1 ....- p) 1 ~ 1 - VP, k log (1 - p) ~
log (1 - 1P) S1nce log (1 - p) < 0, the s1gn of the product
result1ng from the mult1pltcatton by tbts expresston IS reversed, hence
k ~ log (1 - (P)/Iog (1 - p)
(2) By the theorem on the addttton of expec\attons, the mean number
of targets 1\lfXJ detected during J, cycles 19 n [ 1 - {1 - p)"] Sett1og
n [1 ~ {1 - p}A] >
m and solv1ng the 1nequal1ty tor k., we obtatn

k~log (1-7)/tog(t-p).
7 . 78* \Ve make n 1ndependent tr1als under dtfferent condtttons,
the probabrl1ty that an event A may occur on the first,. the second, and
so on, trial Is p 1, Pz , Pn \Ve must find the mean value and variance
of the random 'artable X, wh1ch IS the total number of occurrences of
the event A To stmpltfy the calculations,. we average the probahd1tJes
p 1 and replace them by one, constantt probahllity
n
- i ~
P=-
n
LJ p,.
t=t
\Vtll I\1 [X] and Var [X] be calculated correctly?
Solution. 1\'i (X] w1ll be calculated correctly
n n.

l\1 {X) =np=* ~ Pt =~Pi~


i~t i=l
As to the var1ance 11 1t 'vtll be overestimated To prove this, we com·
pare the approximate expression for the variance
n
Var~=nfiq, where q=t-P==-} h (1-p,)
wtth 1 ts exact value
n
Var:e~ 1] P~:ql., wh-ere ql = 1- P.t
(=f
\Ve. transform the snm 1n two ways
n. n n. n
~ {p,- P} (q,- 9>=- 2j p,q,- 2.1
i--t i;lat 1.= I
Pt9-- 2J I
1=
pg, + npq
n
=}]
f~l
p,q,-npq,
Variable s
Ch. 7. Charact eristics of Functio ns of Random~== 211
·------~~~~~~~~~~~ ~~

n n
2j (Pt-P ) (qt-q)=~ (Pt-P )(1-P i-1+p )
i=1 i=1
n
= -LJ (Pi-P ) 2 -<. 0.
i=1

lienee
n ~

2j Piqi-npq=Varx-Varx~ 0;
1=1

and that is what we wished to prove. Note that the equalit y in V ar x


"'
>
Varx is attaine d only for p 1 = P 2 = · · · = Pn = P·
7.79*. Prove that if X 1 , X 2 , • • • , Xn are mutua lly indepe ndent, posi-
tive and similar ly distrib uted, then
k n •

M[~ x~fL; x 1
i=1 i=i
J=+·
Solution. Since all the variabl es X 1 , X 2 , • • • , Xn are positiv e, the
denominator never contain s zero. By the theorem on the additio n of
expectations
k ·n• k n

M[~ xif~ x1]=~ M[xif~ x1].


i=O j=i i=i i=i

Since all the variabl es X 1 , X 2 , ••• , Xn have the same distrib ution,
we have
n n
M[xtj~ X 1]=M [xmj~ X 1]
i=i j=1

for any i and m. We design ate their commo n value as a

At the same time it is clear that the sum of all the variabl es of the
n
form Xi j~ X 1 is equal to unity, and, conseq uently, the mean value
j=1
i" also unitv:

212 A pplted Pro bll'mz In Pro btJbtlttg Th tory

Replacing the e~prccs1on under the s1gn of the mean value by a "'e
n
get 2] a= n~ = 1 '"hence we ha\ e a-= 1/n Consequentll,
J-1
h n 14 11 A

't[h-'lfhx~J=~ \t[x~J~ x1 ]=h~=+·


Je 1 J-t i::r l J~t ~~t

and tba t 1S \V h 1 t \\ e 'v1 shed to prov~


7 80 There are n loads \vhose energy requ1rements per unit time
are 1ndependen t random variables X t X 1 , X n. '\\ tth the same
(arht trary) d lStrlhutton \\0 conc.;tder a ranaom "arlable Zt =
"
Xi/~ X t (t = 1 n}. "Inch ts the fraction of the total consurn.p
t~l

tton of energ} taken by the lth load Pro\ e that for an} L the mean
\ al ue of the random \ artable Zi IS J/n
Solution. It xs clear that the mean value of the vartable Z 1 extst.s
~nnce 1t 1s Included bet\veen zero and unaty The dtstr1but1uns of alJ
the random var1ables Zt (i = 1. n) are the same {s1nce the problem
Is completely symmetrical) and thetr mean values are equal r.rt (Z11-
1\1 (Z 2 1 = = 1\I IZn 1 Stnce the sum of all the random varrables z,
1s e:qual to u.n1ty \he sum of the1r mean values 1S also equal to un1ty
n n
~~ (L, X, l ~ ~ l\llZd == 1 tae n\I (Ztl = 1~
1~1 i=l
'If Z,] = 11n (:c.= 1 , n).
7 81 -'\n equll a teral trtangle \Vl th Side a = 3 em can be constructed
by dr '\ Wtllg a line ~egmen t of length a from an arb 1trary potnt 0 then
drawing another ltne segment a at 60° to
the first and connect1ng the endpo1nt of
the Qecon d ltne to the or tgtn (F tg 7 81)
The segments of length a are measured
"'1th a ruler graduated 1n mtll1metres
The maxtmum poss1ble error 1s thus 0 5 mm
oe 60 o The angle ts constructed w1th a pr~\rUactor
o.._..__ '\Jth a maximum po.sstble error of 1 s1ng
a the ltnear1z a tton method, find the mean
r 1 value and the mean square dev1atton of
F !3._!1, the length of the th1rd s1de X
Solutzon 'Ve des•gnato the actual length o£ the first s1de as X 1 that
of the second s1de as X 2 and the. actual angle as 8 These random varHt
!J,.les C&'•Y W C6"l~5'l.je,.-,_,~ ((J 1J"l! ma(aal/s rrnfti_{1t!au\rat ~''tr attil'\9'

A.= Y x;+ x:-2X1X 2 cos e


U.s •ng the l1near lt at 1o n method, we find
mx = vm~l + mi:l - 2mxtm~~ cos mot
Ch. 7. Characteristics of Functions of Random Variables 213

where mx 1 = mx 2 = 30 mm, cos m(} = 0.5, whence mX =


V900+900-900= 30mm. Furthermore,
ax ) - ( 1 2xl- 2x? cos {} ) m = _21 '
(
ax 1 m- 2 V xi+4-2x1 x 2 cos{}
ax ) ( 1 2x2-2xl cos{} ) - ..!._
( ax2 m = 2 V xi+x~-2x1 x2 cost} m- 2 '

( ax ) = ( J_ 2x1 x 2 sin {} ) = 30 V3 = 15 V3 mm.


ilfJ m 2 y xy+x~-2x 1 x2 cos{} m 2
We calculate
ilx ) 2 8x ) 2 ax ) 2 V
Varx ~
(
Varx2 +
(
( 8xl m Varxl -j- 8x2 'm {}f} m arl'}.
We do not know the variances of the arguments, we only know the
maximum practically possible deviations of their mean values: ~x 1 =
L'1x 2 = 0.5 mm; Mt = 1° = 0.01745 rad. Setting approximately <Yx, -
Cix, = ~x 1 /3 = 0.167 mm,
Var:~., = Var:~- 2 = 0.0278 mm 2 ,
Civ = MJ/3 = 0.00582 rad, R
o<._- _]}__ -!X
Varv = 3.39-10-5 rad, we ---
0
get Varx = 0.25-0.0278·2 +
675-3.39-10- 5 ~ 0.0368 mm 2 ,
ax ~ 0.192 mm. Fig. 7.82
7.82. The distance D from
a point 0 to an object R is
defined by measuring the angle a at which the object is seen from
the point 0 (Fig. 7.82), and then, given the linear dimension of the
object X, and assuming the angle a to be small, the distance is found
from the approximate formula
D = X/[2 sin (a/2)] ~ Xla.
The linear dimension of the object X, as seen from the point 0, may
change, depending of its accidental rotation, in the range from 8 to
~2m. The angle a is determined to within 0.0001 rad. The distanceD
IS large as compared to the size of the object X. Find an approximate
~ean square deviation an of the error in determining the distance D
If the measure of the angle a is equal to 0.001 rad.
Solution. Using the linearization method, we have
11 ( iiD ) 2 2 ( aD ) 2 :r~
<JD = ax m <J;r --j- aa m <Yo: •

\Ye assume the linear dimension X to be uniformly distributed in the


inten·al (8, 12): ax = (12 - 8)/(2 y3) = 2/y3 m, cri = 4/3 m2;
m.'C = 10 m. Furthermore,
1
(Jet~ 3 0.0001; _i_
(J 2 -
a:- 9
10-B·
' ma.= 0.001,
214 AJ pl ed PrcJbletns tn Pr ob 4b llft y Theory

\vl1ence
ob = (.!a.) m ai + (- . .-z-
2
) 2
o~ == ..! 3
.. 10 41
m2,
a2 m 9
;-
an = l
3
i 103 = 1 20 103 m
7 83 There are t\\ o alm os t I1near funct1ons of n ra nd
om arguments
Y = fPu (X tt X 2 , Xn ) an d Z = r.pt (X X2 , Xn) G1ven the
ch ara cte r1 stt cs of th e system m:r Varx (t ~ 1, 2, 1
l I
, n) and the
co rre lat io n ma tri 't If Co v 11 II fin d an ap pr ox tm at1 on
of the covar1ance
Co \yr
So lu tio n Lt ne ar1 z1 ng th e fu nc tio ns fP and <f'.a: we ge
11 t
n
mlf 8 } + 2} ( ~~: )m X,,
t:Eil

'vh en ce
n
o
z= "' {
LJ
oqJ, )
ox ; rn x,,
D

'1-1

n
= ~ ( acprJ ) ( o<pz ) Var + " ( ocpy ) ( arp~ Cov 11
l= i
a~1 m i}z i m #' LJ ox, m
i~J
OZJ
)
m
Th e las t su m co nta 1n s n (n - 1) ter ms
, an d ea ch Cov 11 1s as so cta ted
\V Ith t\v o of th e ter ms of th e su m
!! X
8 11
~ ) ( o<pz ) Co v
( iJ:r an d
i m OXJ m fJ

o<p Y )
( OXJ m ( acp z ) C
iJ:t1 m OV JJ•
o :r 7 84 Th e dt sta nc e R fro m a po tn t [(t o th e or1g1n
ca n he fo un d 1n tw o \Vays (1) th e dts tan ce s X an dY
fig 7 84 to th e co or dt na te ax es ca n be de ter mt ne d an d the n R
cart be found frotn th e fo rm ul a R =
or (2) on ly th e d ts tan ce Y to th e absc1ss a ax1s an d th e1 an x~ + Y 2., V
gle a need be
fo un d (F1g 7 84 ) an d th en R ca n be fo un d fro m th
e fo rm ul a R2 =
Y/ co s a \Ve mu st de ter m1 ne ,vh tch ,va y 'vt ll lea d to th
e sm all er error
1f th e dt sta nc es X an d Y an d th e an gl e a ar~ fo un d w1 th
mu tu all y In
de pe nd en t er ro rst "1 th th e me an sq ua re de v1 att on s of
th e err or s 1n X
an d Y be1ng cr x = au an d th at of th e er ro r 1n th e an gle
be tn g arJ.
Ch. 7. Characteristics of Functions of Random Variables 215

To do the num erica l calc ulati ons, we assig n valu es to the mea n devi a-
tions of the erro rs O'x = O'y = 1 m and O'a = 1° = 0.0174 rad and to the
mean values of the para mete rs mx = 100 m, my = 60 m and mcx. =
arctan (m,/my) ~ 59° ~ 1.03 rad.
. (1) 8R1 X
Solution. ax = V xz+y z ,

) m cr~ > cr~, cr2 = VVa r [R 2 ] >


2

Var [R 2 ] = ( cos; a;
)
m
cr~ + ( ta~
yzcos a
a;
O"p

Thus the cond ition a 2 > a 1 is fulfilled for a > 0. For the num erica l
data of the prob lem we have
1 2
a1 = Gy = 1 m, cr 2 = ([ 1 + ( : ) ]

X [1 -j-60
2 2
(
1
°g )
6
2
0.0174 2
Jr' =3.9 m.
2

7.85. A syst em of thre e rand om vari able s X, Y, Z has mea n valu es


mx = 10, my = 5 and mz = 3, mea n squa re devi ation s ax = 0.1,
cry = 0.06 and O"z = 0.08 , and a norm alize d corr elati on mat rix
1 o. 7 -0. 3
11 r 11 = 1 0.6 •
1
Using the linea rizat ion meth od, find the mea n valu e and the mea n
square devi ation of a rand om vari able U = (3X 2 1)/(Y2 2Z2 ). + +
+
Solution. mu = (3·1 00 1)/(25 2. 9) = 301/43 =• 7. +
au - 6x au (3x2 +1) 2y.
ax- y2+2 z2 ' ay =- (y2+2 z2)2 ' •

2
au - - - (3x -!-1) 4z (au ) - 6·10 -1 4 .
az - (y2 +2z 2 ) 2 ' ax m- 43 - • '
au ) 301·1 0 ( au ) 301·12
( ay m = - (43)2 = -1.6 3
and az m = - (43)2 = -1.9 5,
Var [U] = 1.42· 0.12 + 1.632· 0.06 2-!- 1.952. 0.082
+ 2 [ -1.4 -1.6 3- o. 7. 0.1. 0.06 + 1.4- 1.95 · 0.3· 0.1- 0.08
-1- 1.63-1.95 · 0.6 · 0.06 . 0.08] ~ 0.066; O"u ~ 0.26.
7.86. Two arbi trary resis tors R 1 and R 2 are conn ecte d in para llel.
The rated valu es of the resis tors are the sam e and equa l to mr =
mr. = 900 Q. The max imu m erro r in R whic h can arise whe n the ro'sis-
to:s are man ufac ture d is 1 per cent of the rate d valu e. Use the linea riz-
atlOn meth od to find the rate d valu e of the resis tanc e for the two resis -
216 App1ted Proble ms h~ Proba bihfy Theor y

tors tn paral lel and tts mean squar e devia tton

So!ub on. R-= R~tR2 =«p( R,. R,). mr-r p(m, ,,m,. )=45 0 Q,

Ort = O"rt =-} ;gg = 3 Q, ( ~~ ) m = [ (r1 ~~r2 )' ]"' =+'


acp ) _ ( a<p_) __ J...
( iJr, m-- {Jr 1 'nl-- 4 '
2
Var[ R}= ~ ( -:~- ): o:,={ Q~, Or~ 1.06 g
(..m1

In th1s case the max 1m um error IS 3 2 0, and that IS 0~~ 7 per cent (rather
than 1 per cent) of the rated value
7.87.. The reson ance frequency f r of an osclll a tory CltCU l t is found
trom the expre sslQn fr = 1/(21t )f LC), \Vhere L is the 1nduc tance c.f
the CitCUlt and C ts the capac1 tan co of the CJrcur t Ftnd an appro
x1mat1on for the mean. value of the reson ance frequency of the
CitCUlt and Its mean square devia tion 1f ml = 50 p.H, me = 200 pFt
cr, = 0 5 }l-H ana Oc = 1 5 pF
Solution~~ m1r = f/(2n V m~mc) = 1 59 f'H.
8/r) -1 -t
( Bz m = 2nmb'2rnf122 m,r 2ml i =
olr ) __ -1 -m
( oC n~- 2Rm lf2m3122 __. 1r
-1
2mc ~
l c
m!
Var, = ( 8fr ) 2 ol + ( oG
8/r )2 at
c
= t,
4
t ol m m

-
=mJr 5 )2 .to-•
(T

5
air== mf-r. s ·1D-2 = 1~0· 10-2 l'Hi
and tbts 1s 0.62 per cent ol the rated frequency
CHAPTER 8

Distributions of Functions
of Random Variables. The Limit Theorems
of Probability Theory

8.0. If X is a continuous random variable with a probability density I (x), all!P


a random variable Y is in a functional relationship with it
y = !p (X),

where !p is a differentiab le function, monotonic on the whole interval of the pos-


sible values of the argument X, then the probability density of Y is expressed by
the formula
g (y) = I (~, (y)) I 'i'' (y) I, (8.0.1)·
where \jJ is the inverse of tp.
If cp is a nonmonoton ic function, then its inverse is nonsingle-v alued and the·
probability density of Y is defined as the sum of the terms of the form (8.0.1),
the number of summands being equal to the number of the values (for a given y) of
the inverse function:
k
g (y) = 2j I {lJlj{y)) llJl~ (y) I, (8.0.2)·
t=i

where '1'1 (y), \jJ 2 (y), .•. , lJlk (y) are the values of the inverse function for a given y.
For a function of several random variables, it is more convenient to seek the
distribution function rather than the probability density. In particular, for a func-
tion of two arguments Z = !p (X, Y) the distribution function

G (z) = ) JI (x, y) dx dy, (8.0.3)•


D (z)

~here I (:z:, y) is the joint probability density of the variables X and Y, and D (z)
IS the domain on the x, y-plane for which cp (x, y) < z.
. ~he probability density function g (z) can be determined by means of differen-
tiatiOn of G (z): g (z) = G' (z).
The probability density of the sum of two random variables Z = X Y is +
expressed by either of the formulas
00 00
'
g (z)= J I (x, z - x) dx, g (z) = ) I (z-y, y) dy, (8.0.4)
-oo -oo

where I (x,, y) is the joint probability density of the variables X and Y.


th In particular, when the random variables X and Y are mutually independen t
en I (:z:, y) = 11 (x} / 2 (y) and •
00

g(z)= \ ldx)l 2 (z-x)dx, (8.0.5)•



-oo
or
00

g (z)= \ fdz-y)f 2 (y) dy. (8 .0.6)·


~

-oo
::218 Applltd Problem• ln Probabllltv Th.eorv

In that case the d1strthubon of the sum g (.z) 1s called the conuolutfan (compo
~ltlon) of tb.s dlst·nbutto.ns c.f the terms Is (~} and / 1 {y}
If a randam varfable htu a normal dutrlbutton and Is 8ubje~ted to a ltnear tram.
formation. thtn the r~sulttng random. vartablt wtll agaIn have a normal diltrtbutfon
In part1eular. d a ra.ndoiil vartable X has a normal dtstttbntlon w1th parameters
m.rr a.x• then the random var1able Y ~ aX +
b (where a and bare nonrandom) has
+
a normal dtstr1huhon wtth parameters mu = am3: b. a11 == l a (a~
A t:onvulution of two normal di..rtrtbutlon funttfons / 1 (z) and / 1 (g) wilh t:harat
tertsti.c3 m.x, Ost m 1p av rtsptctively resultl ln a normal dlstr,button wltk chitrac--
ttrtstics
(8 0 7)

The add1tion of two normally dastrabuted random var1ables X, Y w1th parameters


mx~ al!f m11 t au aud the carrela t1on eoefficJen t r 3:rJ results 1n a random var1allle Z,
~hlth al.ra has a norm a l d l.t t rtb u t lo n 1D Uh t: ha raet eris l k1

mz == mx-+m11 , O'z == Y ol +a~ +2r.:tyO'~Ov (8 0 8}

A ltoear funehon of several 1nd~pendent normally d1str1buted random vartable!


x t• x It ~ • x'f\

where a 1\. b are no nr a ndo nt coe (fie rents. also has a normal d[s t r tb u t 1011 wJth pa
cameten
ft

rnz == ~ a 1mr, + b, (B 0 9}
1~1

where mxl' a~ 1 are the parameters of the random var1ahle X 1 (f = t, n)


If the arguments X1 X1. ,. Xn are eorrellated~ then the dtstrthutton of the
ftnear function rema1ns normal but has parameters
R n
mz= lJ alm.!'i+b,. D'z~ iJ a:oi~ +2 2} a 1 a 1 r~ 1 %Jo~ 1 axp (8 0 10)
t~t l~t i<i
where rx1xJ 1s the correlahon coe!fictent of the var1ables x1• x1 (l = J, .... n,
I =F t)
The conuolu.tlon of two normal dlttrr.buti.ons on a plane IS the dtstr•hution of a ran
dom vector (X, Y) wtth cornponents X == XJ X 1 t Y = Y1 +
Y 1 , \Vhere (X1, Y~} +
(X 1 , Y 1;) are uneotr~lated random voctors. (rl:IXa = r :rdl• ~ r 111 %, = r Vdl• l =
A con.r,.olution of two normal dtstt,butions on a plane results tn a normal dutrf...
6utton wfth param~ters
mx == m~" + mx•' mg = ""u • + mJI.~'
'Ox= lf cfl.:xl + o'lxst ou~Vo~ 1 +0~ 2 , (S 0 tt)

I '\Vheace
Covx 11 = Cov.;tHJt +CoV'.:r 2112 "
rxw = {rxfp {1'$-\rJY l +TxzY 2f1-:::.{Jy~_)/(C ,.,_a y} (8 0 12)

\Vhen a random pOint (X Y), wh1ch IS normally dtstrtbuted on a plane. rs pro..


~ecten onto the z ax•s~ wh1ch pass~s through the centre of sca\terlng and at an
.angle a to the :r.axts~ a random polnt Z results wb.1ch has a norillal d1strlhut•on
Ch. 8. Distribution s of Functions of Random Variables 219

with parameters

(8.0.13)

Probability theory limit theorems form two groups: (1) a law of large numbers,
and (2) a central limit theorem. The law of large numbers has several forms, .each
o()f which establishes the stability of the average for a large number of observat10n s.
1. Bernoulli's theorem (the simplest form of the law of large numbers). As the
number n of independent trials, in each of which an event A occurs with proha- 1
hility p, tends to infinity, the frequency P~ of the event A converges in probability to
the probability p of the event:
limP {IP~-pl <e}=1, (8.0.14)
n-+oo

where e is an arbitrarily small positive number.


2. Poisson's theorem. As the number n of independen t trials, in each of which
an event A occurs with probabilitie s p 1 , p 2 , • • • , Pn• tends to infinity, the fre-
-quency Pi; of the event A converges in probability to the average probability of the
-event:

(8.0.15}

3. Chebyshev's theorem (the law of large numbers). As the number n of indepen-


·dent trials, in each of which a random variable X, with expectation mx, assumes
.a value Xi, tends to infinity, the arithmetic mean of these values converges in
probability to the expectation of the random variable X:
n

lim P {
n-+oo
~ *•=L} •
1
Xi- mx I< 8} = 1. (8.0.16)

4. Markov's theorem (the law of large numbers for varying experimenta l condi-
.tions). If X1 , X 2 , ••• , Xn are independen t random variables with expectation s
mX.t' mx 2, ••• , mxn and variances Varx,• Varx.• ... , Varxn• all the variances
bemg bounded from above by the same number L, i.e. Varx 1 < L (i = 1, .•• , n),
then, as n tends to infinity, the arithmetic mean of the observed values of the ran-
-dom variables converges in probability to the arithmetic mean of their expectation s:
n n

;~~ P {I+.~
t=i
Xi - +.~ t=i
mxi / < E} = 1. (8.0.17)
••
\

When the speed of convergence in probability of various averages to constant


·values is to be estimated, Chebyshev'.~ inequality is used:
P {I X - mx I ~ tt} ~ Var)a2 , (8.0.18)
wh~re a> 0, and mx and Var.x are the mean value and variance of the random
\'arwble X.
The central limit theorem has different forms of whicJ1 we give three .
. 1. Laplace's theorem. If n independen t trials are made, in each of which an
1!~ ent A has a probability p, then, as n -. oo, the distribution of a random variable
X, the number of times the ~event occurs, tends to the normal distribution with pa-
Tameters m= np and a= V npq (q = 1 - p). \Ve can go on from this to calculate
:tho probability that the random variable X falls on any interval (a, ~): for a suf-
220 Applied Problemr In Probttbillty Theory

fJ Cl en tl y large n

f{XE(a jl.)}x~ { P;:n~


l npq
)-<Il{ ;np),
npq

I nstrad of forrn ula (8 0 191 \\ e o!ten u U} 1ze the C"\: pre~c:ion for the prob<thlitty
that the normah:ted varJable•
Z = (Y - mx)la:r t= (X - np)IJ' npq m.z: == 0 Uz == f
(ra th.Pr than the r9 ndom vnnahle X lfc:elf) falls on that Jnterval For a sufficJentl:v
large n
P {Z E (a; ~}) :::: ci> (~) - a> (a) {8 0 20)

2 The ctntrol llmtt tht>or~m for stmllarly dlstrtbuted ttrms It X 1 X, , Xra


are similarly dJstrJbutpd 1ode-pe:odent random var1ables With mean value m~ and

mean square d evHI tlon 0'~; then for a suifi<aently large n, thetr tum
ha1 an appro:rrma ttly normal dlstr1bution '\\ 1th ... Jlarameters

my .=; nm~ all = vn CJ"( (8 0 21)

3 Lyoputcvs theorem. If Y1 X M Xn are Jndependent randoltl var1ables


wtth mfan \alues m';( m;c 1 ; m:t n and 'arJances Vttr........ , Var........ V.ar:t n,
and the restrtctton

(8 0 22)

~hete b, = ~f {\ X 1
n
l3 l ~s sah~fii!d tb~n {ot a sufntlt-n'\l:i latg~ 11 the random
vanahle Y = ~ X t has an appro.ztmately normal dutr1butfon \Vlth parameters
-.~o

(8 0 23)

The s-ense of th-e rest t ~ c t 1ou (8 0 22} IS that th~ rand om v a r1a bl es sbcul d be c.om
parable as conc~rn.s the order of thetr ~ffPct on the c:cattPrtng of the sum

Problenls an(l ExercTses


8 1 A cont1nnous random variable X has a probabilttl denstty
I (.x) F1nd the prohabiltty dens1ty function g (y) of the random vartab1e
l' = aX + b ,vhere a and b are not random
Solution Stnce the function ~ (x) ==== ax + b IS monotoniC, '' e can
~rtR. f~w~. .~ \fb Q, \.\ ~~tb ,~"1~~-:m~~ ".\'ui~"'\m~ ~'(a) t'bii "iJt ~tJu"tli. ~. .j ~Jr.'-tl!L.~
the equat1on y ax = +
b With respect to x we get 'P (y) = (y - b)/a
\Ve now find that 'i> (y) = 1/a and i >tp' (y} ) = 1/} a i~ hence

g ( y) =
t
laf J ( 11-b )
a (8 i}
Ch. 8. Distribution s of Functions of Random Variables 221

8.2. A random variable X is uniformly distribute d in the interval


(-rc/2, rc/2). Find the distributio n of the random variable Y = sin X.
Solution. In the interval (-n/2, n/2) the function y = sin x is mono-
tonic and, therefore, the probabilit y density of the variable Y can be
found from formula (8.0.1): g (y) = f ('ljl (y)) I 'ljl' (y) 1. It is convenien t
to arrange the solution of the problem as two columns, writing the
designations of the functions in a general case on the left and those of
the specific functions corresponding to the example on the right, e.g.
f (x) 1/n for x E (- n/2, n/2)

Y= q> (x) Y=SlllX

X=tf{y) x=arc sm y
'ljl' (y) 1/V1-y2
g (y) = f ('ljl (y)) 'ljl' (y) g(y)=1/ (nl11-y2 ) for yE(-1, 1).

The interval (-1, 1), which includes the values of the random variable
Y, is defined by the range of the function y = sin x for x E (-n/2, n/2)*>.
8.3. A random variable X is uniformly distribute d in the interval
(-rc/2, n/2). Find the probabilit y density function g (y) of the random
variable Y = cos X.
Solution. The function y = cos x is nonmonot onic in the interval
(-n/2, n/2). We find the solution by analogy with the preceding prob-
lem, only in this case the inverse function will have two values for
each y [see (8.0.2)]. We again arrange the solution in two columns:

f (x) 1/n for x E (- n/2, n/2)


!Y = cp:(x) y =cos x
'ljl 1 (y) x 1 = ·-arc cosy
X=
'ljl2 (y) X 2 =arc cosy

l'ljl~. (y) I .-l'ljl~ (y) I 1/V1- y 2


k
g (y) = 2j f (~; (zt)) l'ljli(y) I g (y) = 2/(n lf1- y 2) for y E(0, 1)
i=!

8.4. A random variable X is uniformly distribute d in the interva1


(-n/2, ;r./2). Find the probabilit y density function of the random vari-
able Y = 1sin X 1.
A~ver. g (y) = 2/(n lf1 - y2 ) for y E (0, 1).
8.0J. A random variable X has a probabilit y density f (x). Find the
probability density function of the random variable Y = 11 - X I·

th *l In w?at follows, when ~~lving si:nilar problems, we shall write everywhere


c expressiOns for the probab1hty dens1ty only on the interval where it· .
assurnmg that it is zero outside the interval. '
'
222 Ap pl ed Prob lt m1 tn Pro babl ltty T htor y

8 6 A con tinu ous ran dom var iabl e X has a pro bab lltty dens1ty
1~ (x) \Ve con stde r the var1 able Y = -X F1nd tts pro bab ility density
fune t1on / 11 (y)
Ans l\er / 11 (y) - fx (-y )
8 7 A con tinu ous ran dom var1 able X IS unt
form ly drst r1b uted on the Inte rva l (a. ~) (~ > ")
I o I F1nd the dist ribu tion of the var tabl e Y = -X
Answer It lS untf orm on the Interval (-P -a)
8 8 A con tinu ous ran dom variable X has a
pro bab tl1t y den s tty f" (x) Ftn d the probab1hty
F•g 8 10 den sity fun ctio n f 11 {y) of 1ts mod ulus Y = ] X I
Answer fu (y) = f:x (-y ) + fs (y) for y > 0
8 9 A random var tabl e X has a nor mal dist ribu tion wtt h para met ers
mx O'x Wr ite the pro bab ility den stty fun ctio n of Its mod ulus y-
I X I In par ticu lar wha t wtl l be the dist ribu tton for m::c = 0?
Ans wer

f,(y )= ax ~2n {exp [ - (y1:t>' J+exp [ - (11-;~s)' ]} for y> 0


If mz =O then

f,(y )=
Ox
;-
2n
np (- :',Ox)f or y> O
8 f 0 A Circ ular whe el fixed at 1ts cen tre 0 (Ftg 8 10) 1s rota ted
the rota t1on betn g dam ped by fric tion As a resu lt a fixed potn t A
on the rim of the whe el stop s at a cert ain heig ht H (pos1t1ve or negat1ve)
from the hor izon tal l1ne I I wh1cb passes thro ugh the centre of the
"he el The heig ht H dep end s on the ran dom ang le e at wb1ch the
rota tion term inat es F1n d (1) the dtst ribu tion of the heig ht H and (2)
the dist ribu tion of the abs olut e valu e of H
Solu t1on H = r s1n 9 'vbe re the ang le E> 1s a ran dom vari able
uni form ly dtst rtbu ted 1n the tnte rva l (0 2n) The solu tion of the prob
lem Will evi den tly not cha nge 1f the ran dom var tabl e 9 1s assumed
to be unt form ly dist ribu ted 1n the Inte rva l {-fi /2 n/2 ) then H IS a
monotOniC fun ctio n of E)
Ch. 8. Distribu tions· of Functions of Random Variables 223

The distrib ution density of the variabl e I H I is


'
- -
g(h)~-(nrJ/1-
2 1
(+) )- for -r<h <7r.

The distrib ution density of I H I is


' g1 (d)= 2(nr y1- (fr r 1
.for O<d< r.

8.11. A random variabl e X has a Raylei gh distrib ution with prob-


ability density f (x) = iz
exp ( - ;;2 ) for 'x > 0. Fin~ the prob-
ability density functio n g (y) of the variabl e Y = e-xz. '

IY J
1
••
2

a B
0 1 !/ 0 X 8{C •

Fig. 8.11 Fig. 8.13 Fig. 8.15


'

' •

Solution. On the interva l of the possible values of the argume nt the


function y = e-xt is monoto nic. Applyi ng the genera l rule, .we get

f (x) ; 2 exp ( - 2: 2 ) (x > 0) ·


Y=q>(x) y=e-x z
·-..,....--
X='¢ (y) x=V -lny •

1'1>' (y)l (2y lf -In Yt 1


g (y) = f('¢ (y)) I'll'' (y) I g (y) = 2
: 2Y exp ( ~;:' )
1-2all
1
= 2crll y 2a for 0 <!Y < 1.

The graphs of g (y} for different cr are given in Fig. 8.11.


?·.12. A r~ndom variabl e X ~as a Cauchy distribu tio.n with prob-
ab~l~ty density f (x) = [n (1 x~)l- 1 ( - oo < x < oo). Fmd the prob-
+
nhihty density functio n g (y} of the inverse variabl e Y = 1/X.
Sol~tion. Bearin g in mind that despite the discon tinuity of the-
func~lOn y = 1/x, the inverse functio n x = 1/y is single-valued and
solviDg the problem according to the rules for a monoto nic furi~tion ,.
:2.'24 Applied Prob{em:s in Prof.usbiUtv Theory

-we get
t
g ( y) =
n
rt+y-J
1
1 2 "'!I
1 e the tnver.se of a uartab le whtch has a Cauchy dtstributton also has a
Cauchy dtstr,button
8 13 Through a po1nt A on the y axts at a unat dtstance from the
or1g1n a stratght I1ne A B Is dra,vn at an angle a. to they axis {FJg 8 t3)
All the magnitudes of the angle et from -tt/2 to n/2 are equtprobahle
F1nd the probab1ltty dens1ty funetHln g (x) of the absclssa X of the polnt
B \vhere the stratght ltne cuts the absc1ssa axts
Solution X ===- tan ~ tht~ funct1on IS monotontc on the Inter\ al
-n/2 < a< n/2 \Ve have g (x) = ltt (1 +
z 2)] 1 (-co<~< oo}
11 e the random varu.lble X has a Cauchy d1strtbution
8 14 A dlserete random qarlable X has an ordered ser1eS
-2 l- t 1 o ' t 1 2
ot]o2]oafoglot •
tC.onstruct the ordered seruas of the random varutbles Y = X1 1 +
Z = JX I
Solution For each X \Ve find the respective value of Y and Z and
place them 10 1ncreastng order \Ve get an ordered sertes
1 ' 2 f 5 o 1 t f 2
y z
oaloslo2 oaloslo2 •
8 15 A stra1ght l1ne AB IS drawn through a pOint A wttb coord1nates
(0 1) at a random angle 9 to the axis of ordtnates (Ftg 8 15) The
distrlbUtton of the. angle e has the form f (9) ~ 0 5 COS{} for -n/2 <
-(} < n/2 F1nd the dlstrtbutton of the d1stance L from the l1ne AB
to the or1g1n
Solution We have L = l s1n E> l The funct1on l = 1s1n ft 1 IS non
monotonic on the 1nterval (-nJ2 n/2) Applying the usual scheme for
.a nonmonotontc functton we obtatn
1
g (l) = V i - t3 cos (arcstn l)
-or tak1ng Into account that eos (arcsin l) = V
1 - l 2 've have 8 (l) =
1 for l E (0 1} 1 e the d1stance L has a uutfnrm d1strtbut1on 1n the
tnterval (0 1)
B 16 The radius R of .a cJrcle JS a random v.a.r1ahle \Vhtch has a Ray
1etgh dtstrihutton

f(r) = ;, exp { - ;;. ) for r >0


Ftnd the dtstrihUtion of the area S of the circle
Ch. 8. Distributions of Functions of Random Variables 225

Solution. The function S = nR 2 is monotonic on the interval of


possible values R (0, oo) and, consequently,
1
g (s) = 2l't(J 2 exp ( - 2l't(J~
s n ) for s > 0,
i.e. the distribution of the area of the circle is exponential with pa-
rameter 1/ (2ncr2 ).
8.17. A random variable X has a distribution function F (x), and
a random variable Y is in a functional relationship with it: Y =
2- 3X. Find the distribution function F (y) of the random variable Y.
Solution.
F(y) = P{Y < y}=P{2-3X < y}= p {X>
2-;Y}
2 2
=1-P{X~ ;Y}=1-F( -;-Y)-P{X= 2-;Y}.
If the random variable X is continuous, then the probability that
it assumes any particular value is zero, and
2
F (y) = 1-F ( -;Y).
If X is a mixed or discrete variable, then P {X = 2-;Y } may be
nonzero and equal to a jump of the function F (x) at a point
(2- y)/3. Thus, in the general case,
)+ l~~0 [F( +~x)-F( -;-Y )].
2 2 2
F(y)=1-F( -;Y -;Y
8.18. Given a continuous random variable X with a distribution
function F (x) (Fig. 8.18), find and construct the distribution function
G (y) of the random variable Y = I X 1.
Solution. For X > 0 we get Y = X;
for X < 0 we get Y = -X. F(x}, G(!J}
1 --------
G (y) = P{Y < y} = P{l X I< y}.
For y < 0 we get G (y) = 0; for
Y > 0 we get G (y) = P{-y <X <Y}=
F (y)- F (-y). The function G (y) is
shown by a dash line in Fig. 8.18 [it
then merges with F (x)]. 0
8.19. A mixed random variable X
assumes two values, each with a nonzero Fig. 8.18
probability: a negative value (-x1 ) with
probability p 1 and a positive value (x 2 ) with probability p 2 • In the
mterval between -x1 and x 2 the distribution function F (x) is contin-
uous (Fig. 8.19a); x 1 < x 2 • Find and construct the distribution
function G (y) of the random variable Y = I X I·
Solution. For y ~ 0 we get G (y) = 0 (Fig. 8.19b); for y > 0 we get
G (y) = P{-y <X< y} = F (y)- F (-y). To construct G (y), we
IS-05i!i
226 Applied Problemr tn Probability Thtorv

must subtract from each value ofF (y) the value of the func tton at the-
potnt -y, 'vhtch ts the m1rror reflection of y about the axts of ord1nateg
(see F1g 8 19b where F (x) tS sho,vn by a dash ltne and G (}j) by a .sohd
ltne) The funct1on G (y) Will hn\ e two Jumps at p01nts z 1 and x~H
whtch are equal to Pt nnd p,. rcspect1 vel y

F(x) F(xJ Gty)


-.-....- ........ ~
j I
p? P~

/
/ / }Pr

p/_
x, 0 x1 Iz
((1) (b)
r~g s t9
8 20 A random variable X has a normal dJstributton 'v1th pa
rameters m and a (Ftg 8 20a) F1nd and construct the probabtltty dens1ty
functton g (y) of Its modulus Y = J X I
Solution To construct the probahtltty dens tty of the random vartab1e
Y, ''e must sum up each ordtnate of the d1strthutton curve of/ (x) w1th

f[x) g(f)

0 0 .:ry
9f!J}
(a) (b)
flg s 2.0
the ord1nate corresponding to the value of the probabl11 ty density at the
potnt :r (see F1g 8 20a) For m = 0 new density IS double the prob
tth1l1ty density f (x) (F1g 8 20b)
8 21. A random variable X has a normal distrlhutton with parameters
=
m 0, a F1nd and construct the probah111ty dens1ty funct1on g (y)
of the random var1able Y = X 2
Solubon I (x) = ~-
a 2n
exp ( - pu

2a
::rl3 -) ., y = tp (X)== x:t. The Inverse
functron of y === x2 has two values z1= -Vii and x = + l'Y· By
2
Ck. 8. Distributions of Functions of Random Variables 227

formula (8.0.2)
11 1 r- 1 1 1 { ( y )
g(y)=f(v -y) 2VY +f(v y) 2Vii = 2Vy] exp - 2cr2

+exp ( - 2~2 )}= Jii exp ( - 2~2 ) for y> 0.


For y = 0 the probability density function g (y) has a point of discon-
tinuity of the 2nd kind (it tends to infinity, see Fig. 8.21).

!!
\
\
\
\ f(x)
\
\
" X

0 !I -1 0 1

Fig. 8.21 Fig. 8.23 Fig. 8.24

8.22. Given a continuous random variable X with a probability


density f (x), find the distribution of the random variable
+1 for X>O,
Y=signX= 0 for X=O,
-1 for X <0,
and its numerical characteristics my and Vary.
Solution. A discrete random variable Y has only two values: -1 and
+1 (the probability that Y = 0 is zero).
0
P {Y = -1}= P {X <0}= Jf (x) dx=F (0),
- c:o
P{Y= +1}=P{X>0}=1-F(O),
+
my= -1·F (0) 1·f1-F (0)] = 1-2F (0),
CG2 [Y] = 1-F (0) +
1-[1- F (O)J = 1,
Vary= cx 2 [Y] -m~= 4.F (0) [1-F (0)],
where F (x) is a distribution function.
8.23. There is a continuous random variable X with probability
de.nsity f (x). Find the distribution of the random variable Y =
2
!Inn {X, X }, i.e. of the variable that is equal to X if X< X 2 and to X2
1f X 2 <X.
'228 Appl ed Problem• In Prob~bUUy ThttJrv

Solution The funCtion Y = c.p (x) IS monotonre {shown by a sohd


ltne 1n F1g 8 23)
x1 for z E(0, 1),
<P (x) =
for z Et (0- 1)

Stnce the Interval (0, 1) of the :t axis lS mapped onto the Interval (0, f)
of the y ax1s, 1t follows~ by the general rule, that

I (]ly)/(2 V y) for YE (0. 1)


g(y)=
I (y) for va: (0, i)

8 24 A random variable X has a probabtllty denstty f (z) defined


by the graph (F1g 8 24) A random vartable Y ts. related to X as Y =
1 - X 2 Ftnd tho probabtllty dens1ty functton g (y) of the variable Y
Sol ut1on The probab1lt t y denst t y f (x) lS defined by the funct1on
I (x) = 0 5 (x + 1) for X E (-1 +1) The function y = 1 - r IS non-
monotoniC on that Interval the 1nvcr.se of the function bas two values
:c1 = - V1 - y x 2 == + V 1 - y Hence

g(y)::::: (4111-y) 1
r<t- V t-y>+<t+ Vt-v>J
or

8 2'l A random var1able X has a probabtltty dens1ty f (x) F1nd the


probabtlity denstty g (y) of Its 1nverse Y = 1/X
Solution Although the funct1on y = 1/x 1S nonmonoton1e 1n the
ordtnary sense of the 'vord (for x = 0 It Increases JUmpwise from -oo
to +co), Its Inverse IS stngle valued, and th1s means th.at the problem
can be solved 1n the same 'vay as for monotonte functtons,

g (y) =I { ~ ) :. •
for the values of y 'vh tch can he In verse to the g1 ven set of possible
values of x
8 26 The log normal dtstrlbutton The natural logarithm of a random
var1able X has a normal dtco=trtbutlon w1th a eentte of scattetLog m and
a mean square devtatton a Find the probabtllty density of the vart
able X
Solutton. We designate the normally distributed variable as U
and have

U=lnX, X=e u, f
f(u)= o'V2i exp [ - (u-m)'
20 •
J

Ch. 8. Distributions of Functions of Random Variables 229

The function eu is monotonic:

g(x)= 1
,r- exp
cr r 2n
[
- (lnx-m) 2
2cr2
J
-
1
x
-- 1
.. r - exp
xcr v 2n
[ - (ln x-m)2
2 2
cr
J for x > 0.

This distribution of the variable X is known as log-normal.


8.27. A spot Sp representing a target on a circular radar screen may
occupy any position on it (Fig. 8.27), the probability density of the
coordinates (X, Y) of the spot being constant within the screen. The
radius of the screen is r 0 • Find the probability density of the distance R
from the spot to the centre of the screen.
Solution. We find the distribution function G (r) = P{R < r} =
P{(X, Y) E Kr}, where Krisa circle of radius r with centre at a point 0.
Since the probability density is constant within
the screen, the probability that the spot will fall in
the circle is equal to its relative area:
G (r) = (nr2 )/(nrij) = (r/r 0 ) 2 , whence 0
g (r) = G' (r) = 2rlr6 for 0 < r< r 0 •
8.28. A random variable X is uniformly distrib-
uted in the interval (0, 1). A random variable Y Fig. 8.27
is in a monotonically increasing functional relation-
ship with X: Y = <p (X). Find the distribution function G (y)
and the probability density g (y) of the random variable Y.
Solution. We have f (x) = 1 for x E (0, 1). We designate the inverse
function of y = <p (x) as tP (y). Since <p (x) increases monotonically, it
follows that g (y) = f (tl' (y)) 'i'' (y) = ti'' (y), whence G (y) = tP (y), i.e.
the required distribution function is the inverse of <p (in the range of
possible values of Y).
. 8.29. To what transformation must the random variable X, which
IS uniformly distributed in the interval (0, 1), be subjected in order to
obtain a random variable Y which would have an exponential dis-
tribution g (y) = t..e-'-v (y > 0)?
Solution. Proceeding from the solution of the preceding problem, we
set Y = G-1 (X), where G-1 is the inverse of the required distribution
function G (y) of the random variable Y. We have
y
G (y) = Jt..e-'-v dy = 1- e-'·Y (y > 0).
0
. Setting 1 - e-'-v = x and solving this expression for y, we get the
m~·er~e function y = -(t..)- 1 In ('1 - x). Hence the required relation-
ship 1s
Y = -(t..t 1 ln (1 - X) (0 <X< 1).
A ~;,30. A random variable X bas an exponential distribution f 1 (x)
x (x > 0). To what functional transformations must it be sub-
0
230 Appl ed Problemt fn ProbalHllty Theory

Jccted to reduce 1t to a random vartable Y whtch 'vould have a Cauchy


d1stt1hu\10n f z {y) = {n (1 + y1 )] 1
Solution
Fdx)=l-e b(x>O) F!(y)= !-[arctany+ ~]
Settmg !
[arctan y + J
~ = u and solvmg th1s equatton for g
v. e find the tn verse fu nc t1on F, 1 ( u}
y = F 1 1 (u) = tan (ttu - n/2) -cot '"1 u =
From the precedu1g problem lVO get
Y - F; (F1 (.X)) - --cot ~ (1 ~ e )..X) = cot ne 1X (X >0)
8 31 T\vo pco ple agreed to meet at a certain place bet \veen 12 00
and 13 00 hours Each atrl,es at the meot1ng place Independently and

y !J
It-----

(A Y)

0 t f t
f z 0 , z
{a (6)
Ftg 8 31 Flg 8 32

wtth a constant probabl11ty donsLty 'lt any moment of the asstgned


ttme Interval The first to arrPle ,vatts for the other person F1nd the
prob abJll t} dlSttlbU tlOU funct lOU 0 f the \Val ttng t I m9 and the probabth ty
that the first \Vlll \val't no less than half an hour
Soluhon \Ve des1gnatc the arr1val tunes of the two people as T1
and T 2 and take 12 00 as our reference t1m~ Then each of the tndep~nd
~nt random vartables T1 and T2 Is dlslrtbuted \Vtth a constant density
tn the tnterval (0 1} A random vartahle T 1S the watttng t1me T =
l T1 - T, J
Let us flnd the d1strtbut1on functiOn G (t) of th1s vartable \Ve tsolate
on the p1ane t 10t 2 a domatn D (t) In 'vhxch J t 1 - t 1 I < t (the hatched
doma1n 1n F1g 8 31) In th1s case the d1str1button functton G (t) ts
equal to the area of tbts doma1n G (t) = 1 - {1 - t),. - t (2 - t)
whence \Ve have g {t) = 2 (1 - t) for 0 < t < 1
P{T > 1/2) = 1 - G (t/2) = 0 25
8 32 A random pulnt (X Y} 1s un,formly dlsttlbut~d tn a square
K With untt stdes (Fig 8 32a) Ftnd the d1str1button of the area S of the
rectangle R With stdes X and Y
Sotuhon \Ve 1solate on the z, y plane a domatn D (s) \Vtthln 'vh1ch
xy < s (F1g 8 32b) In this case the dislrlbuttotl function IS equal to the
Ch. 8. Distributions of Functions of Random Variables 231

area of the domain D (s):


1 1
G (s) = 1 - J Jdx d y = 1- Jdx ~ dy = s (1 -1 n s).
D(s) s sfx

Hence g (s) = G' (s) = -ln s for 0 < s < 1.


8.33. A random variable X has a normal distribution with pa-
rameters m = 0, cr. Find the distribution of the inverse variable Y =
1/X.
Solution. y = cp (x) = 1/x; the inverse function is single-valued:
x = 1/y. By the general rule
1
1
g (y) = a ]1"2n exp
(
- 2y2 1 ) 1
y2 or g (y) = 1
ay 1 2n
2
- exp ( - 2 cr!y2) ·
For y = 0 the density function g"(y) has a discontinuity of the 2nd
kind (see Fig. 8. 33). ·
It is interesting to note that the random variable Y does not have
a mean value since the corresponding integral diverges.
8.34. A system of random variables (X, Y) has a joint density function
I (x, y). Find the density function g (z) of their ratio Z = YIX.

g(f!) !!
!f/X=Z

o Y,
Fig. 8.33 Fig. 8.34

Solution. We specify a certain value of z and construct, on the x, y-


plane, a domain D (z), where ylx < z (the hatched area in Fig. 8.34).
The distribution function
0 00 00 zx
G (z) = J1f (x, y) dx dy = Jdx Jf (x, y) dy + Jdx Jf (x, y) dy.
D {~) -oo zx 0 - oo

Differentiating with respect to z, we have


0
1 1
00

g (:::) = - :t.j(x, zx) ax+ xf (x, zx) clx.


-oo 0
232 Applied Problem.r tn Probability Theory

lf the tandom var1ables X and Y are mutually 1nde.pendentt then


0 ~

g (z) = - j J
:t/dz) / 2 (z~) dx+ zjt(x) / 2 (zx) d:r.
-~ 0
8 35 Ftnd the distribution of the ratio Z = YIX of t~o Independent
'\\~1m all~ u~s\t1.hut~d tand\)m v at\~b\~s X ann Y "1.\h thaiat.\~t\$\\t~
mx = mv = 0, Ox, f1u
Solution. \Ve first cons1der a spec1al c.ase o~ = uu = 1 On the bas\S
of the preeed1ng problem
0
~ f
1
-2 (~l+zo!:tl) r
01'
j
I
-2 (~Z+z2x2)
y(z)=- J x . .2n e d.x+ J .x 2.-n. e dx
-~ 0
:r2
t r
QO
--2 u+tt) t
= :t le z dx == n (t + 11) (Cauchy's d1s tr1bu tton}.
0
In the general case the rat1o Z = XIY can be represented as Z =
(Y 1 av)I{X 1 cr~)~ 'vhere the vartables X1 === Xfcr~ and Y1 Yia 11 ba'e =
a normal d1strihut1on 'v1th variance un1ty,
Y therefore,
g (z) _ t ax
- n [1 + (O'.x.:}~a;-•J a11
In particular, tf a:Jt =a 11
f then
g (~)=In ( 1 + .zZ)J-l
8.3G. A candom po1:nt (X t Y) 1s untfor1llly
Fig 8 36 distributed tn a circle K of radtus 1 F1nd the
dlstr1button of the random var1able Z YIX =
Solution. In th1s case G (z) 1s the relative area of the domaJn D (z)
(F1g 8 36)
G(z}=! (arctanz+ ~-)
\vhence g (z) = G# (z) ~ (n (1 + z2 )}-1 (Cauchyfs dtstrihutton)
8.37 .. Erlang's dtstributton af order 2 Form a convolution of t\\O
I exponentral dtstrtbutlons wtth parameter A1 1 e find the dtstrtbutJOD
of the sum of t\vo Independent random 'arJables X 1 and X 2 ~h1cb
have pr oba btlJ t y denst ties
It (xl} = Ae- 1
~1 (.xl > 0), I2 (.r2)-== Ae-ut (x1 > 0)
Solution By the general formula (8 0 5) for the convolution of d1s-
tr1buttons '' e have
Co

g (z) = J fl (xt)fz (z- Xt) d:tt.


Ch. 8. Distri bution s of Funct ions of Rando m Variables 23:5

Since the functions f1 and / 2 are zero for negat ive value s of the argu-
ments, the integ ral assumes the form

If
% ~

g (z) = f
1 (x 1) 2 (z- x 1) dx1 = '),} ) e-1..x1e-Mz-x1> dx 1 = A,Zze-1..z (z > 0)~
0 0
(8.37}

The distri butio n with a proba bility densi ty (8.37) is know n as Erlang's-
distribution of order 2. It origin ates as follows. Assume that there is an
elementary flow of event s with inten sity A. on the t-axis , and only every
other point (event) is retain ed in this flow, the interm ediat e point s
being deleted. Then the interv al betwe en adjac ent event s in the so rare-
fied flow has an Erlan g order 2 distri butio n.
8.38. Erlang's distribution of order n. Form a comp ositio n of n expon en-
tial distri butio ns with param eter A., i.e. the distri butio n of the sum of n:

X A.___ __ _~nth point


e; :y.=.! It; v----/'-: :.1>---
II

Xi
0

Xn
"'"'
x (a)
x+dx
l
0
• • 0 0 ••
(6)
Fig. 8.38

i~dependent rando m varia bles X 1 , X 2 , • • • , Xn which have an expon en-


tial distri butio n with param eter 'A.
Solution. We could solve the probl em by conse cutiv ely finding the-
~on:olutions of two (see Probl em 8.37), three , etc. distri butio ns, but it
IS Simpler to solve the probl em proce eding from the eleme ntary flow,
retaining every nth point in it and deleti ng the interm ediat e point s
n
{Fig. 8.38a). X = ~ X il where X f is a rando m varia ble which has an
i=l
exponential distri butio n. We find the proba bility densi ty fn (x) of the
ran.do.m varia ble X, first finding the eleme nt of proba bility fn (x) dx.
Th1s 1s the proba bility that the rando m varia ble X falls in the eleme n-
tary interv al (x, x dx). +
For X to fall in that interv al, it is necessary that exact ly n _ 1
events fall in the interv al x and one event in the interv al dx (Fia. 8.38b)
!he proba bility of this occurrence is In (x) dx = P 11 _ 1A. dx, where p ·
IS the proba bility that n - 1 event s fall in the interv al x. But the nu~:
her of events of the elementar~' flo·w fallin g on the interv al x has a Pois-
23-t Appl~ed Probltm.s ~n Probablltly ThefJry

~on dtstr•button '' tth parameler a = 'Ax., and hence


(1.%')11•1
In (x) dx = (n -t)l Ae-lx d.x,
~(A.r)"-1
fn (x) = (n-t)l e-u (X> 0}. (8 38 t}

1M d \~\.t 1h'U\. \-0-r.. ""\t \~ \)t~lYnh \1\ t y d~-ns. \.t y {& 3S 1) \~ k n(\w n a~ Er l<t.ng ~
dfstribut1on of order n
\Ve could seek the dtstr1but1on functton Gn (.x) for Erlang's d13tt1.b
utton of order n by Integra ttng (8 38 1) from 0 to x, but 1t IS stmp1er to
.der1 ve 1t us1ng the elementary flo\V agatn
Gn (x) = P {X < .z} = 1 - P {X > :t}
For \he event {X > x) to occur, 1t Is necessary that no moro than
n - 1 events fall on the 1nterval X 1 1 e. 0. 1. 2, , n - 1 events,
l1ence
n-1
G,. (:r) = 1- 2 (';:t e-u (.t > 0}.

This express1on can be reduced to the tabulated functton R (m 1 a)


{see Appendtx 2)
Gn (x) = i - R {n - 1, A.x)
8 39 Erlangts generalized d~str1bUt1on Form the convolutlon of two
exponent1al dtstrJbuttons \Vlth dtfferent parameters
/ 1 (x1) = A1e-Atsl (x1 > 0), / 2 (z2) = A2e-At:.:s (:r2 > O) .
Solution \V'e designate X= XI + x2, ,vhere Xt and have the x,.
dtstributtons f 1.. (x t.l t f" (x,.)
In accordance '\ Jth the general formula for a convolution of distrt-
buttons
OQ

g(x) = J fd:rt) fz (x-rl) d.t1


But 1n th1s case both dtslrlbUtions are nonzero only for a pos1ttve value
-of the argument. and th1s means that J1 (x1) = 0 for x1 < 0 and
/ 2 (x - x 1 ) ==: 0 for x 1 > x For x > 0
J
Ch. 8. Distribution s of Functions of Random Variables 235

bution:
g (x) = A.2xe-A:c (x > 0).
Remark. We can prove by induction that the distribution of the sum of n inde-
pendent random variables X 1 , • • • , Xn, which have exponential distribution s with
different parameters At• ... , An, i.e. Erlang's generalized distribution of order n
has a probability density
n n
e -Ajx
gn (x)=(-1)n -l IT
'),i ~ -n~--- (x > 0).
i=1 i=i IT <"' j _ '·k)
k=Fi
n
(The notation f1 signifies that we take the product of all binomtals of the
k""='
. 1
form 'i.J-AIJ. for k=1, 2, ... , j-1, j+1, ... , n, i.e. all except for l.j-l.h)· In
a special case, when J,.i = (),,
n
gn (x) = 2j (-1)i-l C~l.ie-Ar..:.
i=1

The distribution function of Erlang's generalized distribution of order n has


the form
n n
1-e -i.l'.:
Gn (x)=(-1)n -I IT '),.i ~ _ _n_ _ __ (x > 0).
i=1 j=i l.j Il (l.j-l•R.)
k=Fi

n •
Gn(x)= 2j (-i)HC~[1-e-h:c) (x>O).
i=1
If /.1 = ~ = ... = 1. 11 = 1., we get Erlang's distribution of order n:
') {f..x)n-1
g 11 (x)= (n- 1 )! e-i.:c=/.P( n-1, f.x) (x>O),
'
~ ~

Gn(x) = JA.P(n-1, /,z)dx=1- J/.P(n-1, /.x)dx=i- R(n-1, A.x) (x>O),


0 :c
where
m
P(m, a)=~~ e-a, R (m, a)=~ ~~ e-a (Appendices 1, 2).
k=O

8.40. The distributio n of the maximal of two random variables. Given


two random Yariables X and Y with a joint probabilit y density f (x y)
find the distributio n function G (z) and the probabilit y density functio~
g (z) of.the maximal of these two variables Z = max{X, Y} .
. s.oluhon. We shall seek the distributio n function of the random
'anable Z: G (z} = P{Z < z}. For the maximal of the variables X
236 Applted Probltmr In Pro bab fllly Theorv

an dY to be sm all er tha n z, eac h of the var iab les mu st be smaller than z


G( z) =P {X <: , Y< z} =F (z , z),
where
v
r)
X'

F( x, y)= : f(x , y}d :rd y.


-oo -oo
Th us,
t %

G (z)""' J Jf (x, y) dx dy.


-oo -oo

To find the pro bab 1lt ty de nsi ty fun cti on g (z), we differentiate G (:)
w1th respect to z wh1ch app ear s tn the ltm 1ts of 1nt egr at1 on of the
do ub le Int egr al \Ve dif fer ent iat e G (z) as a com pos 1te function of the
tw o var 1ab les z1 and Zt, eac h of which depends on z (z = z, z, = :)
1

(z) == + 8G (J)
g dG (1)
dz
= d {
d:
-oo
[
-oo
J}
f(x , y) d y dz =
8G (z)
az.
d% 1
dJ oz.
d:,
d'
~ :
= Jf (z, y) dy + ) f (x, z) dz
-oo -oo
In the spe cia l cas e wb en X and Y are tnd epe nd en t, I (x, y) =
It (x) ! 2 (y) we have
z z
g (z) = /i (z) ) / 1 (y) dy
-co
+ /1 (z) Jfl (x) dx,
-oo
or, 1n a mo re concise form 1
g (z) = / 1 (z) F 2 (z) + /2 (z) F1 (z)
If the ran do m var iab les X and Y are 1n dep end ent and hav e the same
dis tri bu tio n, the n / 1 (x) = f 2 (x) = f (x) and g (z) = 2! (z) F (z)
8 41 The dtstr~butzon of the ml nim al of tu,o random tar lab les A sys tem
of t'vo ran do m var iab les (X , Y) has a JOint pro ba bil ity den sit y f (x, y)
Ftn d the dts trt bu tto n fun ct1 on G (u) and the pro bab 1l1 ty den sit y func
t1on g (u) of the minimal of the se var 1ab les U = m1n {X, y}
So lut ion \Ve see k the complement of the dts trib Ut ion funct1on "1t b
respect to un1 ty
1 - G (u) = P (U >u ) = P{ X >u t Y> u}
ThlS 1s the pro ba btl tty tha t the ran do m po tnt (X Y) falls 1n the domain
D (u) hat che d 1n F1g 8 41 It IS ev ide nt tha t 1- G (u) =1 - F (u, oo )-
F ( oo, u) + F (u,. u), wh enc e G (u) == F (u, oo) + F (eo . u) -
F (u. u) = F 1 (u) + Ft (u) - F (u, u) D1fierent1at•ng w1th respect to
Ch. 8. Distribution s of Functions of Random 'Variables 237

u, we have (see Problem 8.40)


u u
g(u)=/1 (u)+/2 (u)-) f(u, y)dy- Jf(x, u)dx.
-co -co

When the variables X and Y are mutually independe nt, we have


g (u) = / 1 (u) [1 - F2 (u)] + /2 (u) [1 - F1 (u)] .
.If the random variables X and Y are mutually independe nt and have
the same distributio n, then f 1 (x) = f 2 (x)=
j (x) and g (u) = 2f (u) [1 - F (u)l.
8.42. The distribution of the maximal and the !!
minimal of several random variables. Given n
independent random variables X 11 X 2 , • • • , Xn
distributed with probabilit y densities / 1 (x1), u
} 2 {x 2), • • • , fn (xn), find the probabilit y density
function of the maximal of them Z =
and the minimal of them 0
max{X 1 , X 2 , • • • , Xn}
U = min{X 1 , X 2 , • • • • • • , Xn}, i.e. of the ran-
<lom variable which assumes a maximum (mini- Fig. 8.41
mum) value as a result of an experimen t.
Solution. We designate the distributio n function of the variable z
11s Gz (z). We have
n
Gz(z)=P {Z<z}= IT
i=i
Ft(z),
z
where F, (z) = ~ f 1 (x 1) dx1 (i = 1, 2, ••. , n).
-co

Differentiating, we get a sum whose every term results from the


multiplica tion of the derivative of one of the distributio n functions
Fl (x), F 2 (x), ... , Fn (xn) by the product of the other distributio n
functions. The result can be written in the form
n fJ(Z) n
gz (z) = ~ F (z)
1
IT F,(z).
j=1 1=1

By analogy, designatin g the distributio n function of the variable U


as Gu (u), we obtain
n
Gu (u) = 1- TI [1-F1 (u)].
i=i
Differentiating, we get
n
~ fJ (u)
gu(u)= LJ -:-1--=-=-F,--:-(u-:--) IT [1-Fi {u)].
1=1 i=1
238 Appl,ed Probltml ln Probability Thtory

8 43 There 1.re n Independent random \ariables xl, X s t xit

'\\ hrch hn' e the same di'=trtbution wtth probabtl1ty denstty I (x) F1nd
the dtstrihut1on of the maxtmal of them Z = max {Xi, X 1 • f Xn}
and the mtntmal of them U = m1n{X1 • X2, , Xn}
Solution On the basts of the solution of the preceding problem
G, (z) = (F (z))n, g ~ (z) = n (F (z))11 -t I (z),
Gu (u) = 1 -If - F (u)]n, Ku (u) === n [1 - F (u)Jn-if (u)
8.-44 Three Jndependent shells are fired at an :c, y plane The centre
of seatter1ng co1nctdes '"'I th the ot1g1n, the scattering ts normal and
ctrcular ax = a 11 ==== a Tl•o h1\ nearest to the centre of scatter1ng is
1

chosen F1nd the probabxltty denstty function g (r) of the distance R


from the nearest h1t to the centre
SolutJon \Ve haveR= m1n{R 11 R 2 , R 3 ) From the solution of the
preced1ng problem 1t folio" s that g (r) = 3 [1 - F (r}l 2 f (r) .. ,vhere
F (r) and f (r) are the distrtbutton functton and the probabtl1ty dens1ty
function res pee t1 \ e 1} of the d 1st a nee from an} ht t to the centre of
scattertng
F (r)= i-exp [ - :0~]
f (r) = ;, exp ( - ;;, ] (r > 0)
Henco
g (r) = !~ exp [ - ~:: ] (r > 0)

1 e the prohabtl•ty denstty of the distance from the nearest btt to the
centre of scattering has the same form as that for each of them the only
condttlon being that the parameter a 1s decreased ']13 t1me.s, 1 e re-
placed by o = cr/3
8 45 F1nd the dtstrtbut1on g tL (u) of the m1ntmal of t\\ o 1ndependent
random 'ar1abl~s T1 and T 2 wh1ch ha\ e an exponential dtstrJhuttons
ft. (tt) = A. 1e-Al't (t 1 > O)t f 2 (t2) === A2e-lsit (t 2 > 0)
Solution On the basis of the solutton of Problem 8 42
Cu (u) =It (u) 11- F 2 (u)J +/
2 (u) [1- F 1 (u)J
= llo-ltue-11U.+ Aae-)..~ue-l(u =(.AI+ A~) e-(A-t+"-I)U
1 -e the cl1str1but1on of the mtntmal of tv, o 1ndependent tandGm va.r1
abies "\\}nch have an cxponentu1l distributions IS also an exponential
d1strihut1on 'vhose parameter ts equal to the sum of the parameters of
the Initial distribUtions
\Ve can arr1' e at th1s conclusion much easter 1f we use the concept
of a flo'v o£ events Assume that there 1s an elementaty flo'v wxth 1nten
s1ty A.1 on the t 1 a.xts and an elementary flow" 1th 1ntenstty At on the
t 2 a.x1s \Ve br1ng these two flows together on tl1o t a:xts, 1 e we super-
Ch. 8. Distributions of Functions of Random Variables 23\Jl

impose them. It is easy to verify that the result of the superposition


of two elementary flows is also elementary (the properties of stationari-
ness, ordinariness and the absence of aftereffects are retained). The
intensity of the combined flow is lv1 + A2 • The distribution of the
distance from a given point to the nearest event of the flow is exponen-
tial with parameter lv1 +
A2 •
The same is evidently true for the distribution of the minimal of
any number of n independent random variables which have exponential
distributions with parameters lv1 , lv 2 , • • • , An. It is exponential with
n
parameter ~ lv 1•
i=1
8.46. From the conditions of the preceding problem find the distrib-
ution gz (z) of the maximal of the variables T 1 , T 2 •
Solution.
gz (z) = ldz) F 2 (z) + 12 (z) F 1 (z) = .A1e-'-•z [1-e-'·2z] +A 2e-?.2z P- e-'-• 1 }
+ +
= A,1e-'-•z A2e-'-2z _ (A, 1 A, 2) e-<'-•+'-2)z (z > 0).

This is not an exponential distribution. For /, 1 =A,= A


-
gz (z) = 2/ve-'-z (1- e-z'-) (z > 0) •
. 8.47*. A random variable X which has a probability density I (x)
1s subjected to n independent trials. The results of the trials are arranged
in increasing order. A series of random variables Z1 , Z 2 , • •• , Z 10 • • • , Zn
results. Consider the kth of them, Zk. Find its distribution function
G11 (z) and probability density function gk (z).
Solution. Gn (z) = P {Zn < z }. For the kth (in an increasing order}
of the random variables Z1 , z~, ... , Z 111 ••• , Zn to be smaller than z
it is necessary that at least k- of them be smaller than z: '
n
Gk (z) = Lj Pm,
m=k
wh~re Pm is the probability that exactly m of the values of the random
v_a~Iab}e X in n trials are smaller than z. By the theorem on the repe-
titiOn of trials

whence
n
G,(z)= Lj C~[F(z)]m[1-F(z)]n-m,
m=k
'
where F (z) = J1(x) dx.
.:240 Ap plled Pra ble m.1 ln. Pro babtlitgo T lc.e.ory

The probabtltty denstty gA (z) can be found by differenttat1ng thts


e~press1on and tak1ng 1nto account that
C"' nf ~-1
nm= (m-tH (n-m}l =n n.-t'
C:'{n-m) =n~_ 1 (m<n).
Alter some s1rople transformations we obta1n
g 11 {z} = nC~: t1 (z) {F (t)]"- 1 [ 1- F {z)l"-'.
It 1S much eas 1er, however, to obtain g" (z) d 1rectly, ustng the follow-
Ing simple reasontng The element of probabtl1ty K~t (z) dz 1! approxtm
ately the probab1l1ty that the random var1able Zk (the kth largest value,
of the variable X) falls on the Interval (z, z +
dz) For th1s to occur.
1t 1s necessary that the follO\VIng events are superimposed
(1) one of the values of the random vartabte X falls on the Interval
(z, z + dz)
(2) (I.. - 1) other 'a lues prove to be smaller than z,
(3) (n - h.,) remarntng values prove to he larger than z (we neglect
the probabtltty that ntore than one value fall on the elementary tnterva1
(z, z + dz)]
The probabtlity that every such combtnat1on of events occurs IS
f (z) d2: {F (z)l.t: 1 t1 - F (z}]n.-.t~ The number of eomb1nattons lS equal
to the product of the number n of ways tn 'vhrch we can choose one value
out of the n values to place 1t on the Interval (z, z + dz) by the number
c!:l of ways tn 'vhleh 've tan choose k- i values out of the rematn1og
n - 1 values to place them to the left of z Consequently,
g11 (.z) dz = nC!: ft (z) IF (.z)]"-1 {1- F (z)]"-k dz,
whence
Ktt (z} = nC!-1! (z) CF (z) ]-'-I (1- F (z)Jn-k
8 48 There are four regulators {thermocoup.les) 1.n an electrtc furnace
each of whtch shows the temperature ,v1th an error norm 'llly dtstrJbuted
\VIth zero mean value and the m~an square davJatlon The furnace IS
heated At the moment when t\Vo of the four thermocouples show the
a,
temperature not lower than the crtt1cal te-mperature 'T~h the furnace IS
automatically switched off Find the probab1l1ty dens1ty of the tem-
perature Z at 'vhtch the furnace will be s\v1tched off
Solut1on The tempe1atme Z at wh1ch the iurnace 1s sw1tched off
lS the second smallest (r e the tbtrd largest) of the four values of the
random vartable T wh1ch has a normal dlstrihutlon -w1th the centre of
scattering To and the mean square deviation O"t
f (t-T0 }1
f(t)= y~ exp- 20 1t
a1 2rt
'The eo-rrespond1ng d1str1bution iunct1on
F (t) =dl ( t;,'fo )+0 5
Ch. 8. Distribu tions of Functio ns of Random l'ariable s 241

Using the results of the precedi ng problem for n = 4, k = 3, we


obtain
g3 (t) = 12- exp
Ot l12n
(t-:o)2 r-t-'to )
2at
J[<D ( at

+0.5 ]
2
[ 0.5-<D ( t~'to ) J.
8.49. There are n indepe ndent random variabl es X 1 , X 2 , ••• , Xn,
whose distribu tion functio ns are power functio ns
0 for xi ~0,
x~i 0 <xi~ 1

!
for (i = 1, 2, ... , n)
1 for xi>1,
where k; is a positiv e integer . The value of each random variabl e is
considered and the maxim um value Z is chosen . Find the distrib ution
G (z) of that random variabl e.
Solution. On the basis of the solutio n of Proble m 8.42
i n
G(z)= TI F;(z)= TI zki for 0<z~1,
i=i i=i

or, if we designa te
0 for z~O,
G (z) = zk for 0< z~'i,
1 for z> 1,
i.e. the maxim al of several random variabl es, which are distrib uted
according to a power law in the interva l (0, 1), is also distrib uted accor-
ding to a power law with an expone nt equal to the sum of the expone nts
of separat e distrib utions .
. 8.50. The discret e random variabl es X 1 , X 2 • • • • , Xn are mutual ly
mdepen dent and have a Poisson distrib ution with parame ters a 1 ,
n
az . ... , an. Show that their sum Y = ~ X; also has a Poisson
i=i
n
<li!>tribution with parame ter a= 2j a;.
i=i
. Solution. We fir.st pro\'e that the sum of two random variabl es
·\ nnu X 2 has a Poisson distrib ution. for which purpos e we find
the probab ility that X 1 +X2 =m (m=O, 1, 2, ... ).
m

P{X 1 -TX 2 =m}=~ P{X1 =k}P{ X2 =m-k }


k=O
m-h
e-al
a.
- e-a2,
(m-k)!
242 Applied Problems tn Probabzllty Theory

ml k
Tak1ng Into account t h at Cm = (kl) (m -k)l .. '" e reprec;:ent the ex pre~
s1on tn the form

and th1s 1s a Po1~son dJstrihutton \Vtth paramoter a 1 ct1 +


\Ve have thus proved that the sum of two 1ndepe-nuent random '\Frtla·
bles, 'vh1ch ha' e a Poisson dtstributtont also has a Potsson dtstr1but1on .
Th1s result can be extended to any number of terms by 1nductton
8 51 A S) stem of random 'artnbles (X, }'") 1s normally d1str1buted
\Vlth characterJStlcS mx. my, Ox, a 11 and Zxu Random var1ables (Ut v)
are related to (.1, Y) as [?=a}( +bY+ c and l' = kX + lY m +
F1nd the dtstrJbutton of the system of random 'ar1ables (U" l')
Answer. The S}stem ( L 't l) ts normally d1strtbuted "•th characteris--
tlcs
mu = amx + bmu + c~ mo = km% + lmv +m,
a~ = V aza; + b 2 o~ + 2aba:tayr :tll~
o~ ~ y /. ui + l a~ + 2hlO'.rO'ur xv
2 2

aka]:.+ bla~ + {bk +ai) a xcr11 rxg


.Zuu = au au •
8 52. Form the con\ olutton of t\vo binOmial distribUtions with para
meter6 (nt p) and (k, p)
Solution Assume that X ts the number of occurrences of an event A
1n n 1ndependent trials 10 each of ,vhtch 1t occurs 'Vl th probabiltty p,
X has a htnom1al distribution 'v1th parameters (nt p) Stmtlarly t Y ts
the number of occurrence'; of the e\ent A 1n k tndependent trials under
the same cond1t1on.s 1t h1.s a binomial d1strtbut1on 'v1th parameters
(~, p) Z = X + Y IS the number of occurrences of the event A 1n a
serres of n + k trtals \Vtth pro bab1l1.ty p of the e' ent A on each tt1al,
the variable z also has a btn om1 al d1strthut 1on '\VI th parameters
(n + k, p)
Note that \Vhereas the probab1ltt es p are dlfferent tn dtfferent series,
the convolution of two binomial dtstr1bUt1ons results In a nonbino:m.Ial
d 1str1 but1on
8 53. Ustng Cheby~hev,s tnequal1ty, est1mate from above the pro-
bahillty that a random variable Xt hav1ng a mean '\alue m and a mean
square dev1at1on o, deviates from m by less than 3a
Solution~~ Chebyshev's tnequaltt} {8 0 18) } 1elds

P{ X D lXI o! i } 8
I -ml#3a}~ (aa) 2 = (Sal 2 =g-, P{rX-ml <3a ~9·
Ch. 8. Distribution s of Functions of Random Variables 243

Thus any random variable deviates from its mean value by less than 3cr
with probability not less than 819*>.
8.54. A large number n of independe nt trials are made, in each of
which a random variable X is uniformly distribute d on the interval
(1, 2). We consider the arithmetic mean of the observed-v alues of the
n
random variable X: Y = ..!.n h Xi. Proceeding from the law of large
i=1
numbers, find the number a to which the variable Y will tend (converge
in probability ) as n -+ oo. Estimate the maximum practically possible
error of the approxima te equality Y ~a.
Solution.
n
a=M[Y1=__!__"'
n Ll
JI!I(Xd=_n!_·n·1.5= L5,
i=l
n

~-·
\ Tar[Y]= ....:1::.._ ~ Var[Xd= n 1 1
11 2 n2 12 - 12n ' crv=lfVa r[YJ=
21 3n
i=i

The maximum possible value of the error is 3cr y·


8.55. We consider a sequence of n random variables Xi, X 2 , ••• , Xn
uniformly distributed on the intervals (IJ, 1), (0, 2), ... , (0, n) re-
n
spectively. What happens to their arithmetic mean y =+ h xi
i=i
when

n increases?
Solution. For a given n
n
M [Y)=+ h M IXd= -k- (-}+f+ f+ ... +"]-)
i=1

1 n(n+1) __n_+~1
-2n 2 4 •

As ll--+ oo, M [Y] increases indefinitel y and there is no stability


of the arithmetic mean.
8.56. :S.andom variables Xi, X 2 , ••• , Xn are uniformly distribute d
on the mtervals ( -1, 1). ( - 2, 2), ... , ( -n, n). Will the arith-
n
metic mean of the random variables X 1, i.e. Y =...!... LJ
I!
'V X·, converge to
I

. i=l
zero m probability with an increase of n?

*) . h
Th. 1s.
1s t e extreme, most unfavourabl e case. For random variables usually
F~~o~ntered m practical applic.atio?s t_hi~ proh~bility is much closer to unity.
u . 1 ~stance, for n normal distribUtion It IS 0.99,; for a uniform distribution it is
1

lit ~ • and I or an exponential distribution it is 0.982.


16*
244 1v !1 sed Pro b! r m 1 1n Pro ba bllit y T heo r11

Solution ~~ (J J = ~ ~ 'J f..\ 1} = 0 but the random vartable Y


i:uu 1
"1ll not c. on' erge to zero 1n prohabl11ty s1nce the cond1 t1ons of 1\farkov s
theorem are v1olat~d the 'ar1ancec; of the random 'ar1ables A., are
not l1m1ted bl t ht:' ~arne nurnber L hut 1ncrease Indefinitel y \\ 1th an
1 ncrea ~e 1n n
B 57 A computer produce~ random btnar) d1gtts such that the sym
bols 0 and 1 rna} 1ppc1r '' 1th equn 1 probab1ltt ) at each poSition and
1ndependentl~ of other po.s1t1onq The sequenre of S) mbols 1s dt' 1ded
tnto groups conststtng of the satne s~ mbol~ say 000 111 0 1
00 1111 00 11 0000 1 0 The number of ~vmbols 1n e-1ch group
1s calculated and 1~ d 1" 1ded h) the ntun ber of groups Ho\\ ""Ill the
qu-ottent beha\ e \\hen the number of groups n 1ncrea~'t!s lndefinttely?
Solut1on \ random 'artab le X 1 the number of s~ mbols ln the ~th
g1oup ha-;. , geometric dtstrtbuti on begu1n1ng "1th unrty (~ce C}l(IP
ter 4} ~~ \Xl)--1/p === 2 \ ~r {X;}= glp~ =0 r:i/0 5-z. =2
n

If Y .. =-1t ~ \:,
-J
then

1\1 {1"" n l - _..!._


n
n2 = 2

On the basls of the Ia" of large numbers the 'artable Yn con\ erges
\n proba b1l1 t) \o '\1 {} l - 2 a~ n --+ oo 1tm P { ~ }:--r - 2 1 c} = 1 <
n-OQ
B} the three Sigma rul! the error of the 1.ppro'\ 1mate eqnalit) YJ'J ~ 2
does not ext-eed 3 V21n
8 58 A factory shop produces balls for ball bear1ngs The shop
manufactu res n = 10 000 h1lls per sh ft The probabJlit ) th 1.t a ball
'' 111 he deiectl\ ~ lS 0 o~ The cause~ of defects are l ndepen dent iot
separate: balls The read~ balls 1re Inspected Irllmedtatel~ after their
manufactu re The defect1ve bal1s are reJected and dumped 1nto a b1n
''h1le the ~ound balls "lre sent to the a~sembltng shop F1nd tbe number
of balls for '" htc~ the b1n must be designed so that 1t 15 not overfilled
dur1ng a ~h1ft "1 th prohab1h ty 0 99
Solution The nutnbQor of reJected balls X h tS a blnomtd.l d1strtbut1o n
s1nce n IS large '' e can 'l~sume on the bas1~ of the Laplace l1m1t theorem
that the dtstrib u t 10 n ts apprO"\: tma tel} 11ormal \\ 1th character1 s ttcs
mx ~ up - 10 O'JO 0 OS = 500 Varx = npq == 500 A 0 95 = 475
Ox~218
'\Te find the \alue of l suc.n that P { Y < l} ~ 0 99 or
<D ( l-mx) +O 5=<11 ( l-500)
O;r 21 8
+ 0 5 =0 99

From the tables for the funct1on Q'J (x) \Ve find (l - 500))21 8 ~ 2 33
'vhence l ~ 551 1 e the b1n, designed for appro"ttmatel~ 550 halls
''Ill not be o' erfilled durtng a shift '' tth prohabtl1t y 0 99
Ch. 8. Distribu tions of Functio ns of Random Variable s 245

8.59. There is a queue of n = 60 people to the cashier , the payme nt


to each person being a random variabl e. The mean payme nt mx =
50 roubles, the mean square deviati on of the payme nt vx = 20 roubles .
The paymen ts are indepe ndent. (1) How much money must the cashier
have so that all 60 people get their money with probab ility 0.95.
(2} How much money b will remain with the same probab ility 0.95,
after the cashier pays all GO people, if he starts with 3500 roubles ?
GO
Solution. (1) The total payme nt Y = ~ X;, where Xi is the pay-
i=t
ment to the ith person. On the basis of the central limit theorem for
uniformly distrib uted terms, Y has an approx imately normal distrib -
ution with parame ters my = 3000 roubles , v y = V
60 Y 20 :::::::; 154.8 rou-
bles. We find the necessa ry initial reserve of money l from the conditi on
cD [(l- my)layl +0.5 = 0.95. By the tables for the error functio n
(Appendix 5), (l- my)lvv :::::::; 1.65; l =my + 1.65 vy :::::::;3256 roubles .
(2) The money b that remain s with probab ility 0.95, can be obtaine d
by subtrac ting the sum l found in (1 ), i.e. b = 3500 - 3256 = 244 rou-
bles, from 3500.
8.60*. A lottery is organiz ed as follows . The partici pants buy tickets
with tables of numbe rs 1, 2, ... , 90. A partici pant must choose five
numbers at random , mark them on his ticket and send the ticket to the
organizers who store all the tickets till the day of the drawin g.
On an assigne d day five differe nt numbe rs are drawn at random from
90 number s and the numbe rs drawn are reporte d to the partici pants.
If a particip ant guessed less than two numbe rs (0 or 1), he gets no prize.
If two of his numbe rs coincid e with two of those drawn, he wins a rouble;
if he guessed three numbe rs, he gets 100 roubles , if he guessed four
numbers, he gets 10 000 roubles , and if he guessed all five numbe rs,
he gets 1 000 000 roubles . Find: (1) the lowest ticket price at which on
the average the lottery organiz ers will not lose; (2) the average income
M of the lottery organiz ers when 1 000 000 people partici pate in the
~ottery, each choosin g numbe rs indepe ndently and each buying one
30-kopeck ticket; (3) using the three-s igma rule, find the bounda ries of
the practic ally possibl e payme nts in the lottery ; can we conside r the
to~al paymen t in the lottery to have an approx imately normal distrib -
ulwn?
Solutio n. (1) We designa te as ]J; the probab ility that exactly i of the
fi:e number s chosen by a partici pant coincid e with the drawn numbe rs.
\\ !' find that
3C2
2. 25, ,. -. . Io-~. . , Pa = CC"
P2 = cgq.
c., ·' ~ 5 8'
" :::::::; 8 • 12 X 1Q-4,
90 90

P~ = ctn.,
c~'" :::::::; g· G-' :< 10- •
6
P5
1 '> 28 x 10-s.
= cs00 : : : :; -·
90

The _minimum price of a ticket must be equal to the mean value of


\h~- pnze r ceh-ed by the partici pant who bought the ticket: m =
+
-·-;:) )< 10-!! 8.12 X 10-4 -< 102 +
9.67 >~ 10-6 X 104 2.28 >~ +
246 Applied Probl~ms 1n Ptobabilltu Thtory

tO a r< 106 = 22 3 I0- 2 rouble~~ 1 e the Ininnnum prtce of a t1clet


ts about 23 J... o pee J. . s
(2) JJ = (0 30 - U 2~3) A tOt = 77 x 10'3 roubles
(3) Tho total '' tnn1ngs ~Y '' luch the organ1zers of the lottery must
l DOO OQO
pal Is the sunt of the ''Innings of the partictpan t~ X= ~ X1,
i=i
'' h.erL X~, l~ the amount ''on h) the a.th partic1pant
'' e assume that tht? p1rt1ctpants 1nark. thc1r numbers tndependentl~
.so that the 'artJ.bles .\. t (i == t ~ 2 , t 000 000) are 1ndependent
'' e kno" from thP centrJl lun1t theorem that the sum of a sufflcientl}
1arge n u mher of 1n de pendent rando rn ' art a bles '' h IC h have the same
dtstrlhUtlOJl has apprO'\.UUale}) a norma} distrJbUtlOn \\~e mUSt find
out "hether the ntunber of terms n = 1 000 000 rs suffictent 1n this.
case for the 'at1ahle A to be constd.ered normall~ dtslrtbuted
\\ e find the me1n 'al ue nl"' and the mean square devtat1on ax of the
r"tndom '~rtable ..\ Fot anl l = 1 2 , 1 000 000 '' e get m%i ==
22 3 X 10 .. = 0 221 r.t 2 [ A , ] -== 2 25 10 -~ +
8 12 ...L D 07 X 10~ -
2 28 / 10-' - 2 3~ 10" v nrxt == 2 38 ' tOt ~ 0 22 2 ;:::::: 2 38 X to~
l{enc-e

nz - 106
.)..

(J 'I! = 10 5 l/ 2 38 ;:::;: t 54 t 05 •
\Ve kno'' th 1t for the randont ''lrJahle A, '\ htch has a normal diS-
\r1bul1ont the Tang-e of pract u~all\ poss1ble '"alues 1s m'r ± 3a .:r In our
Caii:e the lOl\ er bound of the po~~ 1ble ' al ues of the randotn \ ar1able X,
pro'\ 1ded that 1 t " as norn1all) dIS tr tbu ted, \\ oul d be IJlx - 3a x =
~2 39 x 10~ The negatt' e 'alue of thts hound s1gn1fies that\\ e cannot
consider the randon1 'artable A. to be normallJ. dtstrtbuted Since there
cannot be negat1' e- '' 1Unkngs
to

8 6! •. F1nd tho Jun1 t ltnl ~ a~ e '\ ,\here a IS a pos1 t1 ve Integer


a-.oo m
m.~o
a
Solutwn. L :~ e-' 1s the probabthty that the rando1n ,anablt
rr=O
X '\ h1ch has a Po1'~·~on dl~tr\.button, ,, tll not e\.c-eed i.ts mean
\ a1ue a But as the param~ter a tends to tnfint t}, the Po1 c:son dts
trtbu t1on approache~ the norntal cltstrihutlon For the normal d1s ..
trtbu t1on, the probtbt h t' that the randon1 'artable "1ll not e'1(ceed
a
~ am ~ ~
1ts mean value 1s .112, and thrs means that l1m .LJ · , e- = 2
a:-oo 0 m
m~

8 62. Indo pendent randoin \arJables X 1 , X 2 , .. have the same


cx.ponent1al distrtbutton '' tth parameter 1- f (.r) = A.e-,._:r
Ch. 8. Distribution s of Functions of Random Variables 241

We consider the sum of a random number of these variables Z =


y
LJ Xi, where the random variable Y has a geometric distributio n
i=!
}Jeginning with unity:
Pn = P{Y = n} = pqn-l (0 < p< 1; n = 1, 2, ••. ).
Find the distributio n and the numerical characteri stics of the random
variable Z.
Solution. The sum of the fixed number n of random variables
11

~ X 1 has Erlang's distributio n of order n (see Problem 8.38) with


i=!
probability density
f (x)- I. (J..x)n-1 e-Ax (x > 0).
n - (n-1)!

The probabilit y density cp (z) of the random variable Z can be found


from the total probabilit y formula on the hypothese s {Y = n}:
00 00

'}, (/..z)n-1 -Az n-1


(n-1)! e pq
n=i n=l
00

= p'Ae-Az ~ O·:t)k = p'Ae-Aze'A.qz = p'Ae-APt (z > O),


k=O

i.e. the random variable Z also has an exponenti al distributio n, but


with parameter 'Ap. Consequen tly
mz = 1/('Ap), Varz = 1/('A2p 2 ).
8.63. The distributio n of the sum of a random, number of random terms.
We consider the sum o( a random number of random terms Z =
!I

~ X i• where X 1 , X2, ••• is a sequence of independe nt random variables


i== 1
:'·hich have the same distributio n with density f (x); Y is a positive
mteger-val ued variable, independe nt of them, which has a distributio n
P{Y = n} = Pn (n = 1, 2, ... , N). Find the distributio n and the
numerical characteri stics of the random variable Z.
Solution. We suppose that the random variable Y assumes the value
n (n = 1, 2, ... , N) with probabilit y Pn- On this hypothesis
n
= z
~ Xi. We designate the probabilit y density of the sum of n independ-
i==t
en~ random variables X 1 , X 2 , • • • , Xn, which have the same distrib-
Uhon, as j<n> (x). \~7 e can find the densities f< 11 >(x) successive ly: we first
find f< > (x), i.e. the conYolutio n of two similar distributio ns f (x) and
2

f (x), then} j< 3> (x), the conYolutio n of f< 2> (x) and f (x), and so on.
2..8 Applled Problems In ProbabllUy Theory

By the total probabtltty formula, the probab1ltt~ dens1ty q> (z) of


the random 'ar1able Z Is
N
fP (z) = }J fP") (z) P n (8 63)
n-1
\~" e found \he numer1c.a 1 char ac ter1st1cs uf the random variable Z
10 Problem 7 64
~~ fZ] = m~myt Var [ZI = Var~ m 11 + nr! Var,,
"here m;ct m u, Var x, V .ar11 are mean 'al u~s and 'ar1a nees of the random
'artables X and Y
'Vhen certatn condtttons are fulfilled} '' e can a~sume, \vtth an accur
acy sufficient for appltcat1ons that the random 'atiahle Z I.S normally
d1strtbuted '~ tth the tnd1cated parameters l\J (Z]t Var [Z] Let us sho\v
this \Vttbout v1olattng the general! ty of our reasoning, \Ve set mx = 0
for the sah.e of stmpllclt)' On the basts of the centrall1m1t theorem, ~e
ca.n assume for terms ha' 1ng the same distrihutton (see Sec 8 0) that
for n > m11 - 3cru > 20 the dtstribution dens1ty j(Ti) (z) tn formula
(8 63) IS normal w1th parameters m<n) = 0 Var(n) = n Var:t lsee
(5 0 33)]
f<
11
l (z) = 1
e"tp ( - --=-~zll!-}
}/2nn Var::r 2n Varx

Formula (8 63) e\:presses the mean value of the functton of the random
ergum~nt Y
N
rp (z) = ~ fPl) (z) Pn = 1\1 [j(Y) (z)J
n-1

For functions suffic1ently close to l1near funct1oJ1S 1n the range of


practically posstble values of a random argument., 've can assume
that the mean value of the funet1on ts equal to the same function of the
mean value of the random argument, 1 e that
t ( z.1 )
!\llf\Y~(z)) ~ I (m v) (z) or 'i' (z) ~ e'tp - v '
V2nmyDx 2mv arx
'vh•ch ts 8. normal distribUtion The approx1mat1on 1s the more precise
the closer to a linear function the function jCf) (z) of the random argu
ment Y IS 1n the range of 1ts practtcally pass 1bl e values m 11 ± 3cr11
Calcula t•ons ha\ e sho\\ n that ,ve can constder this functton to he
appto~1mately l1near provided that m 11 - 3cr11 > 20 For example, tf
the randorn varrahle Y has a Potsson dtstributton w1th parameter a
[see (4 0 26) 1, then the cond 1t1on \Vlll be fulfi lied for a > 40, 1f 1t bas
a h1nom1al dtstr1bUt1on. w1th parameters n, p (see (4 0 24}1 (and a small
parameter p < 0 1), then np > 40 Note that In both cases the random
vartable Y (the number of random terms) 1s appro"<:Imatel} normally
dtstributed Wtth parameters mJI ;;;;;;;;; Vary ==- a (fol a Po1sson d1str1button)
and m 11 = np, Var 11 = npq (tor a btnomtal distrtbution)
Ch. 8. Distribution s of Functions of Random l'ariables 249

8.64*. Independe nt random variables X 1 , X 2 , ••• , X 11 , • • • have the


same exponentia l distributio n with parameter 'A. A random variable
+
Y = U t, where the random variable U has a Poisson distributio n
with parameter a. Find the distributio n and the numerical charac-
Y
teristics of the random variable Z = 2j Xi.
i=i
n
Solution. The sum 2j Xi has Erlang's distributio n of order n
i=i
(see Problem 8.38) with parameter 'A:

( ) - A (A.x)n-1 -J..x (x > 0).


fn X - (n-1)1 e
By the total probabilit y formula the probabilit y density of the
random variable Z is
00 00
I. (Az)n-1
(n-1)!
n=1 n=1
00
(J.za)k
= f..e-i.z-a ~
(k!)2
for z> 0.
k=O
We can express this density as follows:
cp (z) = 'A.e-J..z-aJ0 (2 V l.za) for z > 0,
where
eo
(x/2)~k
10 (x)= ~ (k!)2
k=O
is a modified cylindrica l Bessel function. Then, from Problem 8.66,
we have
M [Z] = mxmy = (a+ 1)/'A., Var [Z] = Varx my+ mi: Vary
= (a +
1)/1..2 +
a//..2 = (2a t)/1.2 • +
8.65. Consider a system of random variables Xi (i = 1, 2, ..• , n)
related to a discrete random variable Y thus:
1, if i~Y,
Xt= 0, if i>Y.
F.The distributio n function F (y) of the random Yariable Y is known.
t~ the .di~tribution of each random variable Xi and the numerical
c wract:nst1 cs of the system of random variables (X 1 , X 2 , • • • , X 11 ).
Solution. The ordered series of the random variable Xi has the form
o 1 1
I
Xi: P {Y < i} P {Y;;;;, i} ' (i='l, 2 ' ... ' n ) .
2.50 Appllt!d Problems tn Probability Theory

Srnce P{Y < ~J = F (z), the ordered series has the form
o r 1
X t. F (f) It - F (f) t

"hence m:c 1 = 1- F (l),. Var.x 1 = F (L) [1- F (i) I


Let us find the covartances of the random variables X f and XJt for
\Vhtch purpo~e we determ1ne 11 [X ~X tl For 1 < 1 the product X1X1
can assume only two values VIZ. 1 tf X 1 = Jt and 0 tf X 1 = 0 Con
sequently lt1tX 1X1)=mx1 =1-F(j) (i<J), whence Covfl~
~l{X 1 X 1 l-m~qm~o;= 1- F(J)- [1-F(t)J[1-F(/)l=F(t)[1-F(J)i
(J. < 1) 1 and the corre la t 1on coefficJen t
Cov 11 F ( t) ( 1- F (1)] • . / F (I) ( 1- F (})]
rsJ = o~ 1 o-tt 1 ~ J~" F (t) F (J) (I-F (i}] (1 -F (I)J = V F (/) 11-F (s)} ~
8.,66. A number of Independent trials are made 1n each of whtch an
event A may occur \"Vlth probah1l1ty p The tr1als are term1nated as soon.
as the event A occurs n ttmes (n > 1) F1nd the dtstrtbUtlon and the
numerical charactertsttcs of the number X of .. ~fatlures'" 1n 'vbtch the
e\ent A does not occur
Solution~ \\ore find the probab1I1 ty that the random variable X assumes
a \ alue k For that e'\ ent to occuT, 1t ts necessar) that the total number
of trials should be equal to n -l- f... (J... outcomes are fallures and n out
comes are successes) By the hypothesis,. the last trtal must be successful
and 1n the precedtng n + k ---- i tr1als n- 1 successes and k fa1lures
must be d1str1buted at random The probabtltty of th1s occurrence IS

P{X=k}=C!+Iit-tPntf, \Vhere q=1-p(k=O, 1, ~-)


The dtstribution obtatned Is a natural generalization of a geometriC
d1stTibut1on, \\e shall -call1t a generalized geometrlc dtstrlbutl.on of order
n. It lS a convolutron of n geornetrtc dtstrtbuttons 'vtth the same para
n

meter p X = ~ X,, 'vhere each random var1able X, has a geometr1e


s:::o;;z.f
d1str tbutJo n
p { y. = h.} = pq~ (k = 0' 1 t ~ ll • ) 41

Indeed, the total number of fatlures 1s the sum of (1} the number of
fatlures t1ll the first occurrence of the event A., (2) the number of fa1Iures
from the first to the second occurrence of the event A, and so on Hence
\\e obta.ln the numer1cal charactei"tsttcs of the var1able X mx = nq!p,
V ar;.: ::=:::- nq!p 2
8.,67. The hypothesiS Is the same as In Problem 8 66, except that the
random variable Y the total number of trials (both successful and
ts
unsuccessful), made ttll the nth occurrence of the event A. F1nd the
d1strtbut1on and the numerical characteristlcs of the random vaxtable Y.
Ch. 8. Distributions of Functions of Random Variables 251

Solution. Y = X + n, where X is the random variable in the pre-


ceding problem. Hence
p {Y = k}= p {X= 7,;-n} = c~:ypnqk-71 = C/::tpnqk-n
(k=n, n+1, ... ).
The numerical characteristics of the variable Y are
my= mx +n = nq/p + n =nip, Varv = Varx = nqlp 2

8.68. There is a random variable Y which has an exponential dis-


tribution with parameter A.: f (y) = A.e-t..Y (y > 0). For a given value
of the variable Y = y a random variable X has a Poisson distribution
with parameter y:
P {X = k 1Y = y} = ~; e-Y (k = 0, 1, 2, ... ) .

Find the marginal distribution of the random variable X.


Solution. The total probability of the event X= k is
00 00

P {X= k} = \ _,_Y.,.-k e-YA,e-1.y dy = ....,..!.~ \ yke-< 1 +1-)y dy


J k! k! J
0 0

- f. kl (1+~)-(k+il- '}, (k 0 1 2 )
- k! "' - (1 + /,)k+l = ' ' ' ....

If we introduce the designations A./(1 A.) = p, 1/(1 +


A.) = q = +
1- p, we obtain P{X = k} = pqk (k = 0, 1, 2, ... ), i.e. the random
Yariable X has a geometric distribution with parameter p = A./(1 A.). +
8.69. A Geiger-Muller counter is mounted in a spaceship to find the
number of particles falling in it during a random time interval T which
~las an exponential distribution with parameter ~t. Particles arrive
111 a Poisson flow with intensity/,; each particle is recorded by the coun-
ter with probability p. A random variable X is the number of recorded
particles. Find its distribution and the characteristics mx and Varx.
Solution. We assume that T = t and find the conditional probability
that X = m (m = 0, 1, 2, ... ).

P{X=mlt}= <"-~lm e-1.pt.

Then the total probability of the event {X =m}


252 Ap pl ed Probl~ms •n Pr ob ab ili ty Theory

Th1s rs a ge om et ru : distr1hut1on '' 1t h pa ra m et


preced1ng pr ob le m ) a.nrl therefore I ~ee fo rm ul er ~I('Ap f.L) (se(' the +
as (4 0 30 ) and (~ 0 31)1

- (
).p ) 2
~l + J' Ap
=m x( m x- r-
t)

8 70 So l' e Pr ob le m 8 G9 on th e h} pothes1s
th at th e counter IS
S\ \ ttc he d on for 1. ra nd om ttm e
T '' 1th dens1t} f (t) (t > 0)
Sol ut to n \ ~ 1n th e preced 1n g pro b] em for T
dJ st rlb ut Jo n of th e '1 r1 ab le X 1s = t th e con d1 t1onal

The n1argul 'll chstrthnt1on


oc
P {\ = m }- J(A~?m e 'Aptj(t) dt (m = 0 1 2 )
()

\' e fin(t th e nun1cr1cal charactertst1cs of th


e ra nd ot n vartable \
Th e cond1 tto na l e'\:pectat1on m~u = 'p t th e 1b so
00
lu te e'tpectat1on m.x =
00

J'Aptj (t) d t- l.p } tf (t) dt = 'Apmt \\h er e m t ~ 1\f {T] By analogy M


0 0
fin d th e second mon1ent ab ou t th e orJgln of
th e random var1able l
{n ot e th at In th1~ '' 1~ ''e can on l} find ab~o
or1g1n an d no t ce nt r ll m om en ts ) lute m om en ts ab ou t the

az f X ' t 1- Apt ~ (i. p t)2


-.. or;,.

et2 [X )= J p ~ tf {t ) dt + ('Ap)Z j fi.J (t) dt


0 0
= A pm 1 L(~p) 2 a 2 ITJ A pm t+ (A .p ) 2 (V ar1 +m i)
where Va 1 t 1s th e 'a r1ance of th e 1 andon1 va rta b le T Hence
V ar 't- at {A ] -m ; = A pm t+ (A.p)2- V ar t
8 71 So l\ e th e preced1ng problen1 fo r th e sp ec ta
Er la ng s d! sl rl bu tio n of or de r (k + 1) \\ 1th pa l ca se 'vhen f (t) IS
ra m et er p,
I (t )- fk +t (t) = ll ~;t)~ e-,... 1 (t > 0)
Ch. 8. Distr ibuti ons of Func tions of Rand om Varia bles 253

Solution.
00
1-L ( ~lt)l!
P(X =m )= J (l.pt) m e-i.p t
m! k!
0
00

- 1-L (A.p)mf-l·" r tm+h e-(i.p +!l)i dt = f-l (A.p)m f-1. 11 (m+k )l


- m! kl J m! kl (A.p+ ~t)m+k+l
0
m { f-l )k-+-1 ( A.p
/,p+ 1-L
)m
= Cm+k \ t..p+ P •

Thus the rand om vari able X has a gene raliz ed geom etric distr ibut ion
of order k +
1 (see Prob lem 8.' 6) with para mete r
!1

The mean valu e of the rand om vari able T, whic h has Erla ng's
+
distr ibuti on of orde r k 1, is mt = (k + 1)/f.t, and the varia nce
Vart =(lc +1)/ f.t2 • Con sequ ently , from the form ulas obta ined in the
preceding prob lem we have
l..p t..p ( f..p )
mx= (k+ 1), Var x= ~l (k+ 1) 1+ fl ,
fl
which can be repr esen ted, as in the prev ious prob lem, in the form
lnx= (k+i ) q1 , Var x= (k-ti ) q1 , whe re qi= 1-P 1·
P1 Pi

8. 72. Whe n phys ical quan titie s are mea sure d, the resu lts are usua lly
rounded off to the leas t scale divi sion of the instr ume nt. Then a con-
tinuous rand om vari able turn s into a disc rete vari able who se poss ible
values are part ition ed by inter vals
equa l lo the scale divis ion. The V(x) '
I
f~llowing prob lem arise s corre spon -
inuo us rand om
3
I
r
dmgl y. A cont I
variable X, whic h has a distr ibut ion
with prob abili ty dens ity f (x), is
rounded off to the near est integ er;
a discrete rand om vari able Y =
~'(X) resu lts, whe re l' (X) is an I....---''-7 -
lllleger near est to X. Find the I
ordered serie s of the rand om vari a- I
-2- r
ble
• •
r and ils num erica l char acte r- I
I
-J
J>-llcs. I
. Sol~lion. The grap h of the func -
h?n T' (X) is give n in Fig. 8. 72. Fig. 8.72
\\hen the dista nces from the valu e
~! ·~11 lo tw~ nea.rest integ ral va_h~es are equa l, the rule of roun ding off
f~ w.ssentwl smc e the proh ab1h ty that a cont inuo us rand om vari able
a11 s 1n any poin t is zero.
254 Applied Problem s in Probabt l,ty The~ry

The probab ility that the random variab le Y assumes an Integral


value h. IS
A+O'
P{Y= J..}= ~ f(:r)d:t (k=O. ±1, ±2, ... ),
A-0 !>

whencP
oo k,.O 5 C!IQ h +0 5o

m11 = ~ k ) f (x) dx Vnr11 = Z k2 \


...
f(x)dx-m~*l
A~-oo A:- 0 ~ ~=-CIO h- D :>

8 73 The rttndom var1ables Y and l'" are mutuall} Independent


and ha\ e Po1sson 's dJstrtb utlons '' tth parame ters a and b Ftnd the
distrJb utlon of the1r d1fference Z ':;;.": X - Y and of th~ mcdulu s of theu~
d1fference U = ' ,:y - Y I == r ZJ
Solutio n. The random 'arJabl e Z may be either pos1t1ve or negatile
The probah llity that Z assume s the \alue k > 0 ts equal to the sum of
probab ilities that X and Y Will a~sume t" o 'a lues differin g by h. (l be
1ng greater than or equal to lfl')

The probab 1ltty that Z w1ll assume a nega\1 \ e value --k IS

For the random. var1ah le- U

m=O
CIO

p {U = k} ~ ~ (ab)m (ah+bli} e-<a+b) (h.> 0)


mJ (m.+k)J
m:o=O

These ptohab1l1t1es can be '' r1tten b:y means. of a modtfie d eyl1ndr1ctd


Bessel functto ns of the 1st ktnd
00

- -k { X) ~
l h {x ) -J " 1
-- LJ m1 C!c+m)l
z ) "+2771 (k
(2 ==:
0 1t
t 2t ~. ) **)
m~o

.-.} lf the un1t of measure ments (the value ol the dlVJSlOn of the Instrument)
IS small as compare d to the range of the poss1hle values of the random. var1ahle X'
then mil~ m. Varv ~ Var~
••} The taEies of cyhndri cal Bessel functiOns of the fst k1nd can he found 1n
te[~rence hooks
Ch. 8. Distribution s of Functions of Random Variables 255

In this case
P{Z=k}= ln(2Vab) (fr 12
e-<a+b> (k=O, ±1, ±2, ... ),
P {U = 0} = 1 0 (2 Vab) e-<a+b),

P {U = k} =In (21/ab) [( Tr' + (i 2


rk
12
] e-<a+b> (k > 0).

8.74. Given the joint probabilit y density f (x, y) of the random vari-
ables (X, Y), find the probabilit y density g (z) of the difference Z =
X- Y.
Solution. The probabilit y density function of the system (X, -Y)
is f (x, -y) and, therefore, we find from the equation X - Y =
X+ (-Y) that
00

g (z) = G' (z) = Jf (x, x- z) dx.


-oo

If the random variables X and Y are mutually independe nt, then


00 00

g(z)= ~ fdx)f 2 (x-z)dx= ) fdY-z)f 2 (y)dy.


-oo -oo

8. 75. Find the probabilit y density of the difference of two independ-


ent random variables which have an exponenti al distributio n with
parameters f.. and It: Z = X - Y, / 1 (x) = f..e-:\.x (x > 0), f 2 (y) =
~e-IIY (y > 0).
00

Solution. g(z) = \ 11 (x) 12 (x-z) dx; 11 (x) is nonzero for x> 0;



-oo
f2(x-z) is nonzero for x-z> 0.

(a) z> 0,
:z:

Consequen ti y

g(z) =
Af.te-i.z (1.. + ~tt 1
for z > 0,
l.f.te~:z: (/.. + ~tt 1
for z < 0.
The parameters of this distributio n are
1 1
m.. = - - - = '
. '). !l '·!l
256 1pplled Problems tn Probability Thtorg

The d1stribut1o n cur' e has the fornt shown 1n Fig 8 75a For l = JL
"e get g (z) -} e "' I (Fig 8 7ab) Tlus distrlbUttO n IS I~ no\\ n as a
L., pia ce d tstrJbu t 1on
8 76 A contolution of tuo dtstrlbut1ons of tu o nonnegative random
tar1ables G1' en t'\ o nonnegat 1' e random 'artables .A and Y '\ tth pro
h 1b1llt} densities / 1 (x) (x > 0) and f 2 {y) (y > 0) form the conloluhon
of the tr d tstrtbu t1ons

;tp g(z}
A/2 g(t)
A.,u

0 z 0 z
(aJ (!J)
Fur 8 7;,

Soluhon ~ssunte Z X _.. } \\ e find the probabtlit } denslt) g (z)


of the r1.ndom '1r1able Z B) the general formula (8 0 5)
100

g (z)- Jfdx) !'I. (z-x) dx


~

TAh1ng 1nto account th'lt / 1 (x) = 0 for x < 0 and f1 (z- x) - 0 for
x > z \\e obta1n

r ft(x)f,(z-.x)dx
:1

g(z)- (8 76)
0

8 77 There 1re t" o mutuall1 Independe nt random 'ar1ables !


1nd Y "h1ch ha' e the sa1ne normal distribUtio n "tth par1.meter~ m-r -
1n u = 0 u l' a IJ Fxn d the probab 1h t) dens1 t} of the sum of thetr mod uh
Z tX~+IY~
Soluhon \\ e designate {) - ! A. 1 f _ 11 1 Their probab~ltt)
dens1ty functtons / 1 (u) and t ,_ (v) are

r- ~2 e'\ - u2 for u>O,


/J (u)
for u~O

f:t (V) =
f{ l
_:.
2n a"
P~p ( - v:y }
2o
for v>O
l 0 for v~O,
respectt~elj
Taktng formula (8 76) Into '\CCOunt ,\ e get for the con\ o] utJon of the
Ch. 8. Distributions of Functions of Random Variables 257

distributions of nonnegative random variables


z
g(z)= Jfdu)f 2 (z.::....u)du (z>O)
0
or

,.~ ) crx~y ~ exp [ - ( 2~~ + 2~~- ~t + 2Z::~) ]au.


11
2
g(z)= ( • (8.77)
0
This integral can be expressed in terms of the error function <D (x), for
which purpose we isolate a perfect square in'the exponent of expression
(8.77). After the requisite transformations we obtain

g (z) =
2 v2
1( :n: (a~:+ a~) exp
-z2
t 2 (cri,+cr~)
I )

X
[ <D ( Ox Vai+a~
zay ) + <D (
Oy
zl1::.
Va5:+a~
) J.
8. 78. A number of messages are sent over a radio channel. The length
of a message X is a random variable which has an exponential distri-
bution with parameter A. (Fig. 8.78). The interval L between the messages
t

~<~v-----"::7"\\-v-lj'-v'Y\\- v1-;·~~~7'\\-,-~1~~~t
X1 L, Xz Lz X3 LJ X4

Fig. 8.78

is a random variable which has an exponential distribution with para-


meter p,. The lengths of separate messages and of the intervals between
them are not correlated. Find the probability that no less than m mes-
sages will be sent during the time t (m > 1).
Solution. The total length T of m messages plus the intervals between
them is a random variable which has a generalized Erlang's distribution
of order 2m- 1 with parameters 1

A., A., ••• , A.; p,, p,, •.. , fl·


m times (m-1) times
. The probability that no less than m messages will be sent during the
hme t is none other than the distribution function of the random vari-
able T:
P{no less than m messages during the timet} = P{T < t} = G (t).
The random variable T is the sum of two independent random vari-
n~les: T = T1 +
T 2 , where T1 has Erlang's distribution of order m
With parameter A.:
(/.t)m-1 -], t
gt ( t) = (m-i)l e · (t > 0);
258 Applitd. Probt!mt tn Probabtllty Theory

T 2 has Erlang s distribUtion of order m - 1 \vtth parameter f'.


g (t} = (l.u)m.-t e-~e (t > 0) ..
! (m-2)J

Consequently, for the ca~e A>~ ''e have


o:)
. t

~ (m.-1)!
{tl (A,'t)m.-l (~- ~)]ft1.-l
g ( t) = g ('t) g (t- T) d-e== dt
J
0
1 2
e-J.t
{nt-
0
r
2}~
e-J.!{f-1')

m .... 2
- Am.-t!J.m-5! ~ C' tm.-2-i ( -1)1 (m- t+()t
- (m -1)! (m-2)1 LJ rn- 2 (A- J.L)m+l
1;;;;;;;2 0

n -1+-\
X [ 1- ~ H"--:r> tl" e-tl-1.1>1 Je-111 (t > 0).
ll=O
For the case tJ. >A \Ve ha\ e
I m-1
"'m-tpm-2
g (t) =j C1 (t- 't) C2 (-r) d"t" = (m- 1)1 (m.-2)J ~ C~-ltm-t-1 ( -1)'
0 j:;;;;;;~J

m-1+~
(m + f) I [ i ""
X (~-h)m+l+t - LJ
k=-0
For the caso A= fl we have
(A')tm-2
g (t) =A (Zm- 2)l e-At (t >OJ.
8 79 . A pendulum makes free continuous oscillations, the angle tp
(F1g 8 79) varying '\\ 1th the time t accord 1ng to the harmonic law
<p = a Slll (wt 9)~ '\\here a lS the amplttude, +
oo 1s the frequency, 8 15 the oscilla t1on pha~e At
a moment t = 0,. whtch 1S not related to the
post t ton of the pendulum, 1 t 1s photogra.pbed
Ftn d the dts tr1but ron of the angle dJ 1\ h1ch the
ax1s of the pendulum makes w1.th the vert1cal
at the moment of photographing F1nd the mean
value and v ar1 anee of the angle ttl
So\u\tOn $ = a Sln 8, vY"'bere the phase B lS
un1forml y d Istr1bu ted 1n the interval (0, 2:t ),
on that Interval the function cp =: a s 1n fJ lS
nonmonotonzc The solution of the problem ,vill evtdently not change
If the Variable 6 IS COnSidered to be uniformly distribUted JD the lnterva}
(-n/2~ n/2)'t where the funct1on ~ 1s m.onotont~ The prohab1l1ty dens tty
of the angle tlJ IS

g(~}=_!_ 1
.!.= 1
for I cp I <a
n Vt-("*V a nVa~-w~
Ch. 8. Distribution s of Fzmctions of Random TTariables 259

Since the distributio n g (cp) is symmetric , its mean value mq1 = 0.


The variance of the angle <D
a
1 ) cp2 dcp a
2
Var = - = .
rp .n ya2-cp2 2
-a

8.80. Given n mutually independe nt random variables T i


(i = 1, 2, ... , n), each of which is uniformly distribute d on the interval
(0, c), find the distributio n of the number Y of the random variables
(points) Ti falling on the interval (a, b) c (0, c). What is the limiting
distribution of the random variable Y when n --+ oo, c --+ oo and the
average number of points on the interval (a, b) remains constant?
Solution. We introduce a random variable Yil which is an indicator
of the event Ti E (a, b):
1, if TiE (a, b);
0, if Tj(l(a, b).
It is evident that

' .
We designate •
~ . - ...... ..._ ........

b-a c-b+a
P {Yi=1}= p= c
, p { y i = ,0 } = 1- p :;= c •

Considering n values T 1 , . • • , Tn to be n indepen~ent trials, in each


of which the event T i E (a, b) may occur~ we see that the random variable
Y has a binomial distributio n with 'par.ameters n and p and- mean
value M (Y] = np: ' · -

P{Y = k} = C~ph (1 - p)n-h (k = 0, ... , n).


Let us consider the case when c -+ oo, n > oo, n/c = '). = const.
~n this case p -+ 0 but the average number of points falling on the
mterval (a, b) is constant: l\1 [Y] = /.. (b - a) = const. We know (see
Sec. 4.0) that in this case the limiting distributio n of the random vari-
able Y is a Poisson distributio n with parameter d = ').. (b - a):
dh -d
P{Y=k} = kl e (k=O, 1, 2, ... ).
T~le number of events in a stationary Poisson flow with intensity ').,,
falling on the interval (a, b), has the same distributio n.
Thus we can make the following conclusion : a stationary flow of
eYer~ts with intensity ').. can be considered to be a limit case of a col-
l:c~wn of n independe nt random points on the interval (0, c), each of
''Inch has a uniform distributio n on that interval, provided that
n - co, c ->- oo, but n/c = /.. = const. We shall need this model of an
e1ementary (stationary Poisson) flow later on.
17•
CHAPTER 9

Random Functions

9 0 A functLon X (t)l! called a random function 1f its value 1s a random \arJab"


for any argumant t B.xamples or random !unctlorn V (t} the supply \oltage!
a computer dppend1ng on t1me t T (h) the ~ur temperature at a gtven polnt ;
a g1 ven moment depend 1ng on the al h tude h abo\ e the ground Q (t) the numb1
of hmcs a computer fa.11s durtng the t1me from 0 to t
The- concept of a random funct1on 1s a generahz1t1on of the concept of a randm
var1able S1nce w·e can constdcr a random variable X to be a function of an eleme1
tary event (il (s~c Sec ' 0) X =
<p (m) (ffi E Q) \VhrrP Q is a !pace of elementat
events or sample space 1t follo"s that the random function X (t) can be repr1
sented as
x (t) === wCt (t)} ((t) E ta. e n,
\\here a nonrandom argument and T Js the domain of the function X (t}
1 IS
The rtaliza"on of a random functron X (t} 1s a sp~c.dic form It a~suroes as a 11
suit of an ex'{)erlment (\\h.e.n the elementary event m h.a.i occurred\ Far exam~h
reg1ster1ng b) an Instrument the suppll voltage of a computer agatnst t1me on tb
Interval (0 T) we get a reahzatJO:
v {t) of the random funchon V (l
v(t} (Ftg 9 0 t)
A number o! tr1als the outcome o
eac.h of \Vh1ch 1s a random !unct101
l l l. (t) y1elds a collectJon of realn.at1on
J I x 1 (t} x 1 (t) Zn (t} of this randon

l I function fl ea}Iza tlOns 1nev1 tab}y d1ffe


from one another due to random cause
I ~ (F1g 9 0 2} For a fixed mom. en t t th1
0 random funchol'l X (t) turns 1nto a1
ord1nar1 random var1ahle- Thls randon
var1able 1s knol\11 as a stet ton of a ran don
F1g 9 0 l function
If we constder several secttons as~
ser1es of po1nts t 1 t 2 t,. rather thar
one sechon of a random function 've get an m d 1mens 1o:nal random vector v.hicl
describes 1t In some approxunat1on (F1g 9 0 3) In pra~t1cal apphcattOn! Jf th1
ru
values of a random n-et l on. a. re reg 1stered \'V! t h some 1nten al {or the val u~s t1
t2 tn or tha argument then we deal 'v1th an n d 1merunonal random vector
A random fun c t 1on X (t) \V hose argument i 9 t 1me 18 usually ca II ed a stoch as tu:
or random process If stochastic process takeQ place 10 a phys1cal s~stem S then th1~
means that the state of the system varu~s at random w 1th txme t If the state ol tht
systemS at a moment t can be descrtbed by one scalar random variable X (t) then
we deal w1th a scalar tandom junctlon (a scalar stochastic process) X (t) If the state
of t.he system S at a moment t u descr1bed by several rBndom var1abl~s X1 {t)
X a (t) X A (t) then "e deal \VI th a vector random function 1' {l} (a 'ector
stochasttc process} "1th k components X 1 (t} X 2 (t) Xk (t)
Stochastic proceQses are clasg1fied accord1ng to a number of cr1ter1a A .stochastJc
proce~s tak1ng place In a system S 1s kno"n as a process with ducrete ttme tf the
transformations of the system S from one state to another iJ.re only poss1ble at cer
ta1 n pred e term 1ned moments t 1. t,. E. x.am pIes o£ a. -process \V1th d1screte
t1me a computer that changes lts states at moments t1 t., preset by thd
machine eycle hme a deviCe tnspected at rnoments t 1 t,. • and transferre
from one category to another as a resu1 t
,.
Ch. 9. Random Functio ns 261

A stochastic process with discrete time is also call~d a random sequence. If t~e
state of the system S is described by one random varwble X, then the stochastiC
process is a sequence of random variable s: X {t1 ), X {t 2), • • • , X (tn), ••.•
A stochastic process taking place in a system S is known as a process with con-
tinuous time if the transitio ns from one state to another can occur at any random
time moments which continuo usly fill the t-axis (or an interval of the t-axis).
Examples of a stochast ic process with continuo us time: a change in the supply

X3(t)
X(t} xdt)
~---xn(t}

Fig. 9.0.2 Fig. 9.0.3


f

voltage of a compute r l' (t) (see Fig. 9.0.1) or a function ing of a device which fails
and is reconditioned at random time moment s. · ·!
A stochastic process taking place in a system S is known as a process with discrete
states if the number of possible states of the system S is finite or countab le. An exam-
ple: a device consisting of two units for which the possible states of the system are
sh both Ullits are sound; s 2 , the first unit is faulty and the spcond unit is sound;
.~3. the first unit is sound and the second
unit is faulty; s 4 , both units are faulty. X(t)
Another example: tran10mission of a roes- 8
sage over a radio channel . Here the I •
s~ochastic process X (t) is the number of
r--.e
.r--.e•
distorted symbols transmi tted up to a I ~~
moment t. This stochast ic process may r - " 1_ _ _ . .

only have a countab le set of states


{0~ 1, ... , n, ... } and "jumps" up by
2 I
..... •
~mty ~t the moment a distorte d symbol
IS recetverl (Fig. 9.0.4). t
A stochastic process taking place in Fi<t 9 0 4
a system S is known as a process with "'' · ·
continuous states if the set of possible
~tates of the system S is uncount able. Exampl e: the process of placing a spacesh ip
\~ a predetermined position about the earth. Another example : the supply voltage
(t) of a compute r.
By the criteria indicate d above stochast ic processes are classified as follows.
1a. Processes with discrete states and discrete time.
1b. Proce~ses with discrete states and continuo us time.
2a. Processes with continuo us states and discrete time.
2b. Processes with continuo us states and continuo us timP.
E!ample s of a process of type 1a. A certain Petrov bought m tickets of a lot-
1ery-Ioan,,which may win or be payed off at certain times (drawings) t • t , • • • ,
includin ~
2
a stochastic process X (t) is the number of tickets that have won up to and
a m?ment t. Another example : the state of the on-line storage of a compute r; al1
posds:ble states of the storage can be indicate d and changes in state may only occur
at 1scretc momcmts in accordance with the machine cycle timP.
f Example of a process of type 1b. An instrum ent may be in one of the following
o~r states: s 0 , sound and not switched on; s1 , sound and switched on, s., faulty
(\pi noft switched on, Ss, faulty and switche d on. We shall encount er many Ham-
es o a process of type 1b in chapters 10 and 11.
262 A ppll~d Problems ln. Probabfllty Theory

E 1.:am ple of a of type 2a 'I he valuf!s X (t1). X {t 1}


proc~~ts of a eont n
uous random variable X s.reo obser\ed nt moments tl 11 The sequenc& ot
'alues th1~ varLable as~umes 1! a proce~C~.s X (t) w1 th continuoll8 states and dJ.screte
tJme For exantple If the a1r tempera.ture T 1s measured tw1ce a day, then the
~equence of record~d 'a lues o' T JS a stochastic proces-' With conhnuous states and
d1screte tune
Ex"lmp!e of a process o! typP 2b the proces' l (tl ol variation of the supplf
voltage of a computer at any mom~nt tor of the level of notse u1 the tranitrDJ.ss~oQ
of a message
Let us tonstder some of th~ charactfrJs hes or
a scalar random !unction X (t
A unllarlalt di~tribul!on of a rnndom Vlitiah\e X (t) IS the dtstr1button of the
~echon X (t) of that randoru functiOn for any val uP of tbe argument t If the random
varJable X (t) tS conttuuou4ill then thLS dlSlt'thutlon 1s thP 1Jrobabtllty den•tty func
hon of the EPCt1on X (t) and 1s destgnnted a• f (t t) If the random vo.nable X (l)
ts dJscrete then the un1var1ate dlstrJbntJOD of the r.andom function X {t) IS an
ardered .ser1es of the probabthty P {z t) that the random vartable X (f) assumM
a value x 1 at a moment t For a n11xed random , ar1ab]e X (l} the un1var1ate d!strl
huhon ts speclfied by a dtsttihuttnn [unctton F (x t) = P {X 't} < .zl Stnce a dJ.9
tr1bUt10n funct on ts the most general form of a di<=trlbutJon and !S su1tab1e for any
random var1ahle "e can u~e thfl general notatton F (.t t} for a. nntvariate dlstrt
but1on too
A b u.-anatt dlstrtbZJ. tJon of a random function X (t) 1S a JOint d LstributiOn ol two
of lts se-ct1ons (X (t1l and X (ttl for a.ny values oft\ and t,l The~ts a !unchon of lour
arguru en ts .:r1 :z 2 t 1 t 2 'V e ea n correspond 1ng1y cons l der an n t aria te d utrl bu Uttn
1\b1ch IS dependpnt on 2n arguments
A random function X (t) JS said to be ~ormaltf the JOint ~~strJbUtlOn of any num
her n of lt'= !ecttons taken at a.rb1trary moments t 1 < t~ < t 3 < < t, ts an
n va rt ate normal dIS tr1 hut ton
The mean t-olut of a random furtchon X (t) 1s a nonrandom funct1on mx (t) \\hH:h
1s the mean value ot the correspond1 ng section of the random function for each
"lalue of the argument
mx (l) = 1-tf [X (t)J (9 0 1)
A correlation junttran (or autocorrel~hon function) of a random Iunchon X {t)
is a. nonrandom fllllChon of t\,o arguments n~ (t t ) ,vhtch 1s equal to the covan
once ol the correspondi ng ~eehonq of the random function !or each patr of "alues
of the arguments t t
Ill
R$ (t t ) ~ 1\l f X (t) X (t )] ,
Q

''here X (t) = X (t) - m=r. (t} lS a centrtd random {unct1.on


Fort = t the correlation funetJon turns into a var1:1J1Ce of the randomlunc bon
R:,: (t t ) =::;. V a.tx- (f} '= Vat [X (t}l ~ \tlx (C)12 (9 0 3)
The ma 'n proptrtLes cj a r:orr~latton fu tctlon
(1) Rx (t 1 ) = Rx (! 1) l e the function R lt t ) does not vary v..hen C1!
replaced by t (symmetry)
(2) j Rx (t t ) l ~ Ox (t) Ox (t }

(3) the function R,. {t t ) IS po~utlve defin• te 1 e J J Rz (t, t ) 'P (t) 'P (t )'<
(B} (B)
dt dt ~ 0 "here ql (f) 1s any function and {B) 1s an) doma1n of u1tegr8tlon
v.. b.1ch 1s the same {or both arguments
For a normal random function the cbaracteristJCS mx (t) R~ (t t ) are exbaus
t1ve and define the dtstributlon of any number of sectlons
A normed correlatwn junetton of a random Iunctlon X (t) 1~ a function
t )= R :t (t t ) = R "t (t t ) l9 c 4)
oi%' (t) a~ (t J 1' Va rx (t) V.ar~ (t )
• Ch. 9. Random Functions 263

i.e. for t = t' the correlation coefficient rx (t, t') of the sections X (t) and X (t')
is equal to unity. ,
A stochastic process X (t) is known as a process with indepf!ndent increments if,
for any values of the argument t 1 < t 2 < t 3 < ... < tk < tk +1 < ... the ran-
dom values of the increment of the function X (t)
U1=X (t 2)-X (t 1), U 2 =X (t 3)-X (t2), ••• , Uk=X (tk+l)-X (tk) (9.0.5)

are independent.
A normal stochastic process with independent increments is known as a Wiener
process if its expectation is zero and the variance of the increment is proportional
to the length of the interval on which it is attained:
mx (t) = O, Var [ Uk) = a (tk+l - tk), (9.0.6)

where a > 0 is a constant.


When a nonrandom term qJ (t) is added to a random function X (t), the same
nonrandom term is added to its mean value, but the correlation function does not
change.
When a random function X (t) is multiplied by a nonrandom factor qJ (t), its
mean value is multiplied by the same factor rp (t) and the correlation function is
multiplied by cp (t) cp (t').
If a random function X (t) is subjected to a certain transformation At. another
random function, Y (t) =At {X (t)}, results.
A transformation L! 0 ) is known as a homogeneous linear transformation if
n n
(1) L~o>f 2j Xk (t) j = 2j L~o >{Xk (t)}
k=i k=i

(i.e. the sum can he transformed term-by-term), and if


(2) £~0) {eX (t)} = c£~0> {X (t)},

(i.e. a factor, which is independent of the parameter t with respect to which the
transformation is carried out, can be put before the transformation sign).
A transformation Lt is called a nonhomogeneous linear transformation if
Lt {X (t)} = Li 0 ) {X (t)} + qJ (t),
where QJ (t) is a nonrandom function.
f If a random f11nction Y (t) is related to a random function X (t) by a linear trans-
orrnation Y (t) = Lt {X (t) }, then its mean value my (t) results from mx (t) upon
t he same linear transformation:
my (t) = Lt {mx (t)}, (9.0. 7)
and. to find the correlation f~nction Ry (t, t'), the function Rx (t, t') must he twice
subJected to the correspondwg homogeneous linear transformation, once with re-
spect to t and once with respect to t':

(9.0.8)

. The cros~correlation function Rxy (t, t') of two random functions X (t) and Y (t)
IS a functiOn .
'
e o
Rxy(t, t')=l\l[X(t) 1"(t')). (9.0.9)
It follows from the definition of a crosscorrelation function that
Rxy (t, t') = Ryx (t', t).
25 \ Ap pl ted Pr ob lem s ln Pr ob ab tll ty The~ry

Th e normed croyscorr~latlon ju nc t ton of t\\ o ra nd om funct1ona X (t) and Y


is a funct1on (t)

(t ,. t ) -
_ R xy (tt t ) _ Rxu (t~ t )
Tx y -
Ox (t) ou (t ) Y \ arx (t) Va r11 (t • (9 0 10)
)
The ra nd om funct1on~ X (t) .and Y (t) ar e un to "e la te d 1f R~JJ (l. t'} $
0
If Z (t) = X {t) +
Y (t}, th en mz {t) = nlx (t) m9 (t) , +
R: (t t') = Rx (t,. t') +
R y (t'" t'') Rx y (t, t ) +Rx11 (t, t') . +
If t\\ o ra nd om func t1ons X (t) an d Y (t) ar e unco
rrelated, then

If
R~ (t~ t') = R:c (t. t') + R t~ (t. t') (9 0 if)
n
Z (t) = ~ X k ( t) ~ (9 0 t2)
It- t
, Xn (t) ar e un co rre lat ed ra nd om fu nc ho ns , the
n
n n
mz (t) = ~ mx (t)!o R~ (t!o t ) = 2J Rx (t, tt)
k=-1 k A= t k
\Vhen tran.sformtng ra nd om fu nc tio ns 1t l! of te
a co mp lex form A co mp lex ra nd om var1abl~ ts a u co nv en ten t to 'M'tte them lD
ra nd om fu nc tto n of the form
Z (t) = X {t) + tY (t). (9 0 13)
wh er e X (t) an d Y {t} ar e re al ra nd om fu nc tto ns
Th e m ea n value th e co rre lat iO n funct1on anan d t 15 a un1t unagiJJ.ary number.
d var1anee of a complex random
funct1on are defined th us

m:r (t} = mx (t) + im11 (t) , Rz (tt t ) = \f (X (t} X (f' )],


0 0
{9 0 {4)
wh er e th e ba r ov er th e le tte rs de no tes a eo m pl
ex co nJ ug ate qu an ttt y and
Varz (t) =
R~ (t, t ) = AI {I (t) 111. X (9 0 t5)
Be fo re constder1ng co mp lex ra nd om va r1 ab les
define va rta nc e as th e m ea n va lu e of th e .square an d fu nc tio ns it 1s necessary ta
of th e modulus,. an d covartance a~
th e m ea n va lu e of th e pr od uc t of a centrerl ra nd om
ga te of an ot he r ce nt re d var1able• * var1ance by the complex conJU·
Th e canontcal decomposltton of a random va r1
m th e form ab le X {t) fs 1ts representaboD

m
X (t) = m~ (t) +~ Vkq'!Jt. (t)••>, (9 0 16}
A=-:1
wh er e Y~r (k = :t, 2 , m) ar e ce nt re d un co rre la te d ra nd om varzable
va ria nc es VarA {k = 1 2, t .m) an d 'Fk. (t) (.l =
.s w1tb
funct1ons Ra nd om var1ables Vk (k = t,. 2. 1 t 2, •t m) ar e no nr an do m

an d fu nc tio ns Cf'tt (t) {k = 1 1 2. .. ., m) as co , m) ar e kn ow n as co eff itlt nf8


compositiOn ordtnate fu nc tio ns of a c.anon1cal de ·

•1 In \v ha t follows we sh al l1 nd tc at e th e co m pl ex
!.very t1me, tf 1t lS uo t sp ec ifi ed , we Bhall co ns id na tu re of a ra nd om fu ne tto l
* •> In pa rtt cu la rt th e su m ca n be ex ten de d toer ana ra nd om fu nc tto n to he
in fin ite (c ou nt ab le) num er
rh
lf ter ms
Ch: 9. Random 'Functions 265

.i' If the random function X (t) admits of a canonical decomposition (9.0.16) in


a real form, then the correlation function Rx (t, t') is expressed by the sum
m
Rx (t, t')= 2j Vark cpk (t) cpk (t'), (9.0.17)
h=1
which is known as a canonical decomposition of a correlation function.
If a random function X (t) admits of a canonical decomposition (9.0.16) in
a complex form, then the canonical decomposition of the correlation function has
the form
m
Rx (t, t') = 2j Vark (jlk (t) cpk (t'l, (9.0.18)
k=i

where a bar over the letters denotes a complex conjugate quantity.


The possibility of a canonical decomposition of a correlation function in the
form (9.0.17) or (9.0.18) implies a representability of a random function X (t) in
the canonical form (9.0.16), where the random variables Vk (k = 1, 2, ... , m)
have variances Vark (k = 1, 2, ... , m).
A linear transformation of a random function X (t) defined by the canonical
decomposition (9.0.16) results in a random function Y (t) = Lt {X (t)} also in
a canonical form, i.e.
m
Y(t)=my(tl+ 2j VkWk (t), (9.0.19)
k=i
where
(9.0.20}
That is, in a linear transformation of a random function, defined by a canonical decom-
position, its mean value is subjected to the same linear transformation and the coordinak
functions are subjected to the corresponding homogeneous linear transformation.
. A stationary*> random function X (t) is a random function whose mean value
Is constant, mx = const, and whose correlation function only depends on the differ-
ence between its arguments: Rx (t, t'\ = Rx (T), where T = t' - t.
From t~e symmetry of the correlation function Rx (t, t') it follows that Rx (T) =
Rx (-t), I.e. the correlation function of a stationary random function is an even
function of the argument -r.
The variance of a stationary random function is constant:
Varx = Rx (t, t') = Rx (0) = canst. (9.0.21)
The correlation function of a stationary random function possesses the property
I Rx (•) I ~ Varx. (9.0.22)
The normed correlation function of a stationary random function is
Px ('t) = Rx (T)IVarx = Rx (T)!Rx (0). (9.0.23)
The canonical decomposition of a stationary random function is
00

X(t)=mx+ :2} (Ukcosrokt+Vksinrokt), (9.0.24}


k=O
'p·Y~ere. rk
(k = 0, 1, ... ) are qmtred uncorrelated random variables with
Uh,
nirw1se equal variances Yar [ Ukl = Var [ Vh] = V ar,..
A Hepresentation (9.0.24) is known as the spectral decomposition of a function.
spectral decomposition of a stationary random function is a~sociated with a series

*> Stationary in a broad sense.


266 .A.pp lfed l'robltml In Proba btltty Thto ry

expatl.slon of IU correla. bon £unction·


QQ

R :t ('t) = 2J
1i 0
Va r" cos IDJt 't
;1;;;;11"
(9 0 25)

\Vhence
00

Var %-= }2 \ ar);E.


~t~o

Set hog w0 = 0 "' e can rewTlte the spectral decomposition {9 0 24) of a .sta-
tzonary r"ndom !unct1on in a complex Iormt 1 e i
co
i c.> Jt. I
~ {t) =m-1+ "'U
L.i 11~hct ~90 27)
J;~-1)11)

,.,. here
JJk=jl11-fV~
2
'I'he- .rpectral d:nsUy ~f a s\ation3't'y Tandom funct1on X (t) 1s the l1mit of tM
rauo oi the \ar1ance per a g1\ t'n 1ntprval of frequencies to the length nf the tnterval
as the latter tends to zero The spPctral density Sj ((I.)) and the eorrelatzon lunct1on
R:c (ri are related by Fourter s tra.nslormatlons n the real form tlus rfJat1on is
~ CIO

S" oo) = ~ JR., {1') cos lil'f dt Rx {'t") = JSac ({t)) CO.Hil1' d{,l (9 0 28)
0 0
It follo\\ s from the last re latlon that

J
Oil!

\ arx= Rx (0) = Sx(c.J) dw {9 0 29)


0
In a tomplex [orm Fourier 5 transformahonst 'vbtcb relate the spectral df=n!ut.y
~; (CJl.) and the correlation funchon Rx (T}, hale the form
00 CIO

j Rx(t)e-""Tdt, R~ {'f)= j s:(w)e'llYidw (9 0 JO)


-oo
Appendtx 7 1s a table of relahons bet\\fen certa.1n correlatton functions and
!pectral densl ttes
Both S~ (oo) and S~ (C&l) are nonn~gattve real {unettons~ S~ (~) 1s an .even func-
tion defined. OU the interval (-oo +co) S:f. (ro) lS defined on the 10terval (0, +co),
and on th.lS 1nwrval S~ {w} ~ {) 5S¥ (()))
A normed sptctral dtnsfty 'x (m) [or sJ (w) tn a comples form] IS a spectral dell
tlt} dlVlded b) the 'ar1ance of the random function X (t)
.f,:t (fJl) = s X ((1))/Var~, s~ (~) = s; (~\Nar~ (9 0 3tl
If a cros.scorrela tton funct1on R r. Y(t, tJ) of tVt o sta tJonary random functions
X (t) and Y (t) JS a funchon of onll 't = t' - ~' then \\e say that they are station
ar!J connected Tbe follo~.tnt .t~latlOWi t_hp.,n_ P:tJS.l bP~l\l.e..P.J:l.lhP.. cro!scorrelated func-
tion Rx!J (T) and the cross power spectral dtnstty S~v (w)t l\b1ch are defined by the
Fou.r1e-r transform.at1on in a complex fol'Ifl:
co

J
(!!)

S *ru (oo} -~ 2:t


t Jr n- tW'f R .. J, (or) d.,...
~ .. . .
~ R" 11 { t") = e l ..,"~" S iv (Iii) dw (9 0 32}
-OD -~
Ch. 9. Random Functions 267

If the normal stationary random functions X (t) and Y (t) are stationary con-
nected, then the random function Z (t) = X (t) Y (t) is stationary with character-
istics
mz = m:-.·my + Rxy (0), (9.0 .33)
00

S~ (w)= ) S~ (w-v) Slj (v) dv+mxmy [S~y (w)Sux (w)]


-oo

+i
00

S~y (w-v) Stx (v) dv+m~St (w)+m3S~ (w'. (9.0.34)


-oo

In a special case, when Z (t) = X 2 (t), we have


7nz= m~+ Varx=m~+Rx (Ol, (9.0 .35)
00
'
S~(w)=2 ~ S~(w-v)S~(v)dv+4mlS~(w). (9.0.36)
-oo
White noise (or white noise in a wide sense) is a random function X (t), whose
two different (arbitrarily close) sections are uncorrelated and whose correlation
function is proportiona l to the delta-functi on:
Rx (t, t') = G (t) o (t - t'). (9.0.37)
The variable G (t) is known as the intensity of white noise. The definitions and
the properties of the delta-functi on are given in Appendix 6.
Stationary white noise is white noise with constant intensity G (t) = G = const.
The correlation function of stationary white noise has the form
Rx (•) = Go (•), (9.0.38)
whence it follows that its spectral density is constant and equal to
s~ (w) = G!2rr.. (9.0.39)
The variance of stationary white noise Varx = Go (0), i.e. is infinite.
If a stationary random function X (t) arrives at the input of a stationary linear
system L, then, some time later, the time being sufficient for the transition process
to he damped out, the random function Y (t) at the output of the linear system will
also he stationary. The spectral densities of the input and output functions are
related as
S*y (w) = S~ (w) I cD (iw) 12, (9.0.40)
where CD (iw) is the amplitude-f requency characterist ic of the linear system .
.T~e stationary function X (f) is said to possess an ergodic property if its charac-
!m~tlcs [mx, Rx (-r)] can be defined as the correspondi ng t averages for one real-
Izatwn of sufficient length. The sufficient condition for ergodicity of a stationary
~ndom function (with respect to expectation ) is the tending of its correlation func-
tion to zero as -r-+ oo:
lim Rx ('t) = 0. (9.0.41)
"t-+00

F~r the random function X (t) to be ergodic with respect to variance Var it is
sufficu.>nt that the random function Y (t) = X 2 (t) should possess a simila~.' (the
same) property, i.e. lim Ry (T) = 0 as -r-+ oo.*l
-t-oc

. .*> For a random function to be ergodic with respect to a correlation function


It IS necessary that the function Z (t, -r) = X (t) X (t
property.
+
-r) possesses a simila;
268 Ap pl ie d Pr ob ltm l ln Proboblllly Theory

If th e reahzaUon of a random functton Y (t) cros


"h1ch ts pa r 11leJ to th e t axJs an d Js ot the drstanee srs (u p" na rd s) a straJght lme
JS sa1d to ~ross the level a ("ee 1-Jg 9 0 5 1n '\\ hzeh a from 1t th en th e funct1on X (t)
th e Je \ el cro~sr.nr;s ar e marked l:Jy
crosses) ThP nu m be r of level cro'CI.~JDgs
x(t) X d ur1ng th e tim e T ls a dJscretE- random
'A rla b] e 1f leove] cro~'O;rng~ ar e Infrequent,
th en th~ var1a bl e ean be cons1d~red to
ha Vf a Po1sson dt ~tr1 button
Fo r a normal s ta tto na r) ra nd om ' an
abl~ Y (t} th e av er ag e number of ero
s
0 "tngs of the level a pe r un Jt ttme 1s
t
1 { (a -m t') J ) Cy
Fi g 9 0 5 Aa- -- 2n: exp - 20'~ az •
(9 0 42)
~here a11 ts the m ea n square de vt ah on of th e de nv at tv e
of th e random functton
y (t )= ~ y (t) (9 0 43)
The average nu m be r of doV.ll?. ard cro~slng's of th
e gl \e n le \e l 1s the ~arne

Probletns an d Exerclses
9 1 Cons1dcr1ng a no nr an do m funct1on of t1
m e cp (t) to be a special
kt nd of r1 nd om funct1on ~\ (t) = <p (t) , fin
d 1ts m ea n 'a lu e mx (t),
va ria nc e V ar :rc (t) an d th e co rr el at to n fu nc tto n
fu nc tto n A. (t) st at 1o na r} ? R x (t, t:t) Is the random
A ns "e r mx (t) ==: rp (t)~ Varx (t) = 0, R x (t, t)
= 0
In th e ge ne ra l ca se th e ra nd om fu nc tio n X (t)
IS no ns ta t1 on ar y s1nce
for rp (t) ~ ca ns t "e ha \-e m% (t) =fo. co ns t
9 2 Th e co nd iti on s ar e the sa m e as 1n the pr ec
ed tn g pr ob le m bu t
th1s t1 m e <p (t) = a = constt ''h er e a 1s a no nr
an do m va ri ab le X (t) =
a Is tl1e ra nd om fu nc tto n X (t) st at io na ry ? If tt
1S 1 th en do es 1t posse~s
an er go di c pr op er ty ?
A ns \\e r Th e ra nd om fu nc tio n X (t) 1s st at io na
ry st nc c mx (t) = a ===
co ns t, V ar r (t) = 0., R ~ (t, t ) = R~ ('t) = 0,
pr o pe rt} an d possesses an ergod1c
9 3 In ea ch se ct io n a ra nd om fu nc tio n X (t)
1s a co nt in uo us random
va rt ab le w ith a U ni va ria te d1 st rib ut 1o n de ns
1t y f (x, t) \V rit e th e
ex pr es st on s for th e m ea n \ al ue m~ (t) an d va
ra nd om 'a r1 ab le X (t) ri an ce Var.x (t) of th e
A ns w er
~
~

m;,; (t) = 5 xf (x, t) dx, V ar1 (t) = ~ {x -m x (t)]a f (x, t) dx


9 4 A ra nd om fu nc tto n X (t) IS a ra nd om va
rt ab le X (t) = V, w he re
V IS a co nt in uo us ra nd om va ri ab le w 1t h pr ob
ab ili ty de ns 1t y qJ (v)
(I ) ' ' r1 te th e ex pr es st on fo r th e un iv ar ia
te dt st r1 bu tto n (denq_1ty)
f (x, t) of th e ra nd om fu nc t1 on X (t), (2) fin d th e m ea
va ria nc e V ar x (t) of th e ra nd om fu nc t1 on X (t) n va lu e m x (t) an d
1
(3) 'Write th e express1on
Ch. 9. Random Functions 269

for the bivaria te probab ility distrib ution functio n F (x1 , x 2 , t1, t2)
of two section s X (t1 ), X (t 2 ) of the random functio n X (t); (4) find its
correlat ion functio n Rx (t, t')_and the spectra l density S~ (ro).
Solution.
(1) f (x,

t) = cp (x);
co 00

(2) mx(t)= J xcp(x)dx=mv, Varx( t)= ~ (x-mv ) 2 <p(x)dx;


-oo -oo

(3) F (x 10 x2 , ti, t 2) = P {X {ti) <xi, X (t 2) < x 2 } = P {T' < Xt. Y < x 2 }·


If xi <x 2 , then T1 <x1 implies that 1' <x2 and P{V <xi, V <x2 } =
:r,
) <p (x) dx. Conseq uently

-oo

( Xt

I -~ <p (x) dx = F (x 1) for


F(x 17 x 2 , t1, t 2)={
I
l
Jcp
-co
(x) dx = F (x 2 ) for

where F (x) = Jcp (x) dx is the distribution function of the variable Tl;
-oo
0 0 0 0

(4) Rx (t, t') = M [X (t) X (t')] = M [Vl'J = Varv,


Varx(t )= Rx(t, t')=Va rv.
Since mx (t) = const and Rx (t, t') = const, the random functio n
X (t) is station ary. Since the t averag e for each realiza tion is equal to
the value the random variabl e l' assume s in that realiza tion and differs
for differen t realiza tions, the random functio n X (t) is not ergodic;
its spectra l density '
co

Si (ro) = V~rv J e-iffi't dro = Varv cS (w),


-oo

where 6 (w) is the delta-f unction . \Ve can immed iately verify this since
co 00

Rx('t) =) S~(w)eiffi"tdw=) VarvcS (w)eiffi -rdw=V ar0 eiO-r=V ar0 •


-oo -co

9.5. A random functio n X (t) is given in the form X (t) = Vt b, +


where V is a random variabl e which has a normal distrib ution with
P?ramc ters nz v and a v; b is a nonran dom variabl e. Find the univar iate
distribu tion density f (x, t) of the section of the random functio n X (t)
and its charact erictics : mx (t), Varx (t), Rx (t, t').
2.70 AppUt4 Pro:~l1mr tn ProbD:btlUy Thtory

Answer. f (x, t) 1S a normal distrtbutton w1th parameters mflt +b.,


\t~atr•

f (~1 t) =
1
v-2n exp -
[ [.r-(ml't+b)J1
2tlol
J'
r t 1 a- o v
rnx- (t) ::=:mot+ b, \rar:c {t) = t a:, 2
R~ (t, t') =a~ tt' 4

9.6. Show that any function of tv.o arguments of the form


t\

~ Var, ~p 1 (t) ~~ (t•), (9 o)


t~t

\\here Var i are nonnegat 1' e numbers and tt'l (t) are any real functton.s
(t = 1 ~ 2,. 't n)., pos.ses~es all the properttes of a correlation !unction

Solution It IS suffictent to sho,v that there IS a .random function X (t)


wln.ch has a cortelat1on function (9 6) Let us cons1der a real tandom
funct1o n X {t) gt' en 1n the form of a canonical dec om pos1 tton
n
X (t) = m. (t) + l} l',rp, (t),
i~t

'\V here Var l V 1l :; ; :;: V ar 1 The correl a\1on funct1on ot th1s xandom funct1on
has the form
n

Rx (t t') ~ ~ \"ar, r.p 1 (t) Tt (t').


i:.zl

and that IS '"hat ''e \\ 1shed to pro\ e


9 7 ~~ G1vcn that the char ac terts tics of a normal st ocbast tc process X (t)
are mx (t) Vo.r x- (t)} R x (t, t') and that the ~tochast1c process ts tn a state
x (A (t) = .x) at a moment t't; f1nd the conditional probabll1ty Pc.a that
at a mQment t' > t the process X (t} ''111 belon~ \o a c-ettatn domrnn
(a, ~)
Pae == P {X (t') E (a. ~) I X (t) ~ x}
Solution. \\7e des1gnate X (t) = X 1 , X (t') = X 2 .. A system of tanrlom
vartables X 1, X 2 has a normal d1str1bu t1on f (x1 , x ~) w1 tb charac tertsttcs
m:t 1 = m.:t (t) t In r 3 = m:;e (t')t

O.rl
2
= ,.,
ar~
(t)
'
2,
Ux~ =
varx (t') ' r _ R% (tt t')
til'- a a
:Xa ;t:t

The cond 1tto na 1 d1s tr1bu tton


+oo
f(x,.\x!\=f(a:: 1,x2}{f,(x.})-t, ''here h{:rt)= ~ f{xn.r 2)dx,.

Th1s dtstrl bu tion IS also norma I \VI th c harac ter1st1cs

mx; I ;r:~; := m2 + r:t 2 Ozt


o.l'c.
(..r!- ml) t CS.:rr J .x, == O'xl v 1-r; 2
Ch. 9. Random Functions 271

Hence
II
J
Pu.~ = f (x2 I x 1 = x) dx 2 = dl
Cf.

-<D '
where <D (x) is the error function.
9.8. Find the univariate and bivariate distributions and the charac-
teristics of a random function X (t) defined by its canonical decompo-
sition
n
X (t) = mx (t) + i=1
2J Vicpi (t),
where Vi (i = 1, 2, ... , n) are crossuncorrelated normally distributed
random variables with characteristics mi = 0, Vari.
Solution. The univariate distribution f (x1 , t) is normal with charac-
teristics
n
mx, (t) = mx (t), Varx, (t) = 2J
i=1
Vari rp~ (t).

The correlation function


n n n
Rx(t, t')=M[l.j Yirpdt)
i=i
2J
j=1
11/Pi(t')]= 2J Varirpi{t)rpdt').
i=1

. The bivariate distribution f (x1 , x 2 , t, t') is normal with characteris-


hcsmx, (t), mx. (t'), Varx, (t), Varx. (t'), Rx (t, t'). The random function
X (t) is normal and, therefore, the bivariate distribution is an exhaus-
tive clwracteristic for any number of sections of the function.
9.9. Given a random function
X (t) = 11 1e-a,t 112e-a•t, +
where 1'1 and V2 are uncorrelated random variables with characteristics
fv• ~ m1,, = 0, Varu,, Varv •• find the characteristics of the random
unct10n X (t).
s.o~ution. The random function X (t) is given as a canonical decom-
position without a free term and, consequently, mx (t) = 0,
+
Rx (t, t') = Vart) 1 e-o:,(t+t') Varv, e-a,(t+t'>,
Varx (t) = Var11 , e-tt• 21 + Var 11 , e-a:. 21 •

. ~.10. A random function X (t) is defined by its canonical decompo-


Sthon
II

X(t)= ~ Yie-o:i 1 +a,


i=1
272 Applu!d Probltmt in Prt~babfltty Thtory

''here lr, nre centred r'lndom vartahles "1th variances Var~. (1 = 1,


2t ' n) 1-.I [ vh l'II = 0 for ' ~ ]., a lS a nonrandom varJal>le
Ftnd the character1~ttcs of the random funct1on X (t) ..
Answero~~
t\

mx ( t) = a, R ~ (t, t') = ~ Va r"1 e-a,u +t'J,


i=J
"1

Var.x (t) = ~ Var-p e- 2at1


i=1 l

9 f 141 A random funct1on X (t) ts defined by a canon teal decomposJhon


X (t) = t + l'1 cos rot + V 2 s1n (J)t,
"here 1'1 and 1'2 are u ncorrela ted rand om var1a bles WIth zero mean
values and var1nnces \'ar1 = Var 2 = 2 Is the random variable "t (t}
sta ttonary'
Solutton mx (t) = t R x (t, t ) = 2 (cos rot cos (J}t' s1n (l)t sin rut') = +
2 cos w (t- t)
The correlation funcllon of the random fun~tton X (t) satLSfi.es tbe
cond 1t Ion of sta t1onar1 t l hut the mean , a. I ue mx (t) depends on ttme
The random functton .A (t) ts nonslat1o.nary but the centred random
funct~on X (t) 1s stationary
9 12 Suppose '" e ha' e t" o random functions
X (t) = V1 cos w1t ..-1-. 1 2 s1n w1 t and Y (t) = U1 cos 6l 2 t U1 stn Cll2t~ +
The mean values of all the random var1ahles V1, Vt• U1 .and UJ are
zero, the variances are Varv. = Varv 1 = 1, Varu. = Varu. = 4 The
normed correlat1on matri"< of the system (l'1 • V2 , U1 , U,,) bas the form
1 0 0.5 0
1 0 -0.5
1 0 •
1
F1nd the crosscorrelat1on functton Rx11 (t, t') and the value of the
functton for t = 0, t = 1.
Solution.
Rxu (tt t') =
., .
l\! IX (t) Y (t')]
= l\1{(17 f cos oo 1t +V oo 1t) (U1coS w._t
2 Sill +
U'! Sln w2t')J
1
tu
=cos -ffit t co~ (J}2.t 1\l (l' t1 +cos OOt t Sin (()2..t AI (V lu21
1

+ +
~1n oo1 t cos (&)1t '~I (V 2 Ut J stn Cxl 1t .ran ffi 2 t~~r CV 2U2J
=cos rott cos ro,t'- sin w1t s1n ro 1 t' =cos ((I)lt w1 t'), +
R~ (0, 1) =cos hl~, Ru~ (0, 1) :=:cos co11
R 11 ~ (t, t') = R:t11 (t' 1 t) =cos (m1 t'-+ w~t}
Ch. 9. Random Functio ns 273

9.13. A Poisson process. '~Te conside r an elemen tary flow of events


with intensi ty 'A on the t-axis (points on the t-axis, see Fig. 9.13a) and
a related functio n X (t), which is the numbe r of events occurri ng during
the time (0, t). At the momen t an event occurs the random functio n

4
6
I
I
J..
~
0
I

I
I
'
I
I
I I
' I
I:

2 I 0 I I I I
' I
~ I I I I

0 r, Tz TJ T4 T5 7i; T7 t
(a)

t' Rx(t,t')

0
t t t I_ t t'
(h)
Var:z,
t~t'
(c)
rx (t, t ')
X{t}-;t(t )
6
5 ---- ---- -
t' 4
3 ----- --
2
--- 1•
--..-- --
(d)
Fig. 9.13
X (t) makes a jump, thereby its value increas es by unity (Fig. 9.13a)*>.
The.random functio n X (t) is known as a Poisson process.
Fmd the univar iate distrib ution of the Poisson process and its charac-
teristics mx (t), Varx (t), Rx (t, t'), rx (t, t') as well as the charac teristic s
of the stochas tic process Z (t) = X (t) - 'At.
Solution. The distrib ution of the section of X (t) is a Poisson dis-
tribution with parame ter a = 'At and, therefo re, the probab ility that
the random variabl e X (t) will assume the value m is express ed by the
formula Prn = ('At) 111 (rn!)- 1 e-1.t (m, = 0, 1, 2, ... ). The mean value
and varianc e of the random functio n X (t): mx (t) = Varx (t) = f..t.
Let us find the correla tion functio n Rx (t, t'). Assume that t' > t
?nd consider the time interva l (0, t') (see Fig. 9.13b). We partitio n the
lnterval into two portion s: from 0 to t and from t to t'. The numbe r of

•> We assume that tho function X (t) is continuous on tho left.


18-0576
274 Applted Prnble1111 ln ProhtJbfllly TA~ory

events over the whole interval (0, t') is equal to the sum of the numbers of events on the intervals (0, t) and (t, t')*): X (t') = X (t) + Y (t' − t), where Y (t' − t) is the number of events which occur in the interval (t, t'). Since the flow is stationary, the random function Y (t' − t) has the same distribution as X (t' − t). In addition, according to the properties of a Poisson flow of events, the random variables X (t) and Y (t' − t) are uncorrelated.
We have
R_x (t, t') = M [X̊ (t) X̊ (t')] = M [X̊ (t) (X̊ (t) + Y̊ (t' − t))] = M [(X̊ (t))²] = Var_x (t) = λt.
Similarly, for t > t' we get R_x (t, t') = λt'. Thus R_x (t, t') = min {λt, λt'} = λ min {t, t'}, where min {t, t'} is the smaller of the values t, t' (if t = t' we can take either t or t' as the minimal value).
Using the notation of a unit function 1 (x) (see Appendix 6), we can write the correlation function as
R_x (t, t') = λt·1 (t' − t) + λt'·1 (t − t').
Fig. 9.13c illustrates the surface R_x (t, t'). In the quadrant t > 0, t' > 0 the surface R_x (t, t') consists of two planes which pass through the t and t' axes respectively and intersect along a line whose applicates are equal to the variance Var_x = λt.
The normed correlation function
r_x (t, t') = R_x (t, t') / √(Var_x (t) Var_x (t')) = √(t/t')·1 (t' − t) + √(t'/t)·1 (t − t').
The surface r_x (t, t') is shown in Fig. 9.13d.
A Poisson process is a process with independent increments since its increment on any interval is the number of events occurring on that interval, and for an elementary flow the numbers of events falling on nonoverlapping intervals are independent.
The process Z (t) = X (t) − λt (see Fig. 9.13e) results from a nonhomogeneous linear transformation of the stochastic process X (t). Consequently, m_z (t) = m_x (t) − λt = 0, Var_z (t) = Var_x (t) = λt, R_z (t, t') = R_x (t, t') = λ min (t, t'), since the corresponding homogeneous linear transformation does not change the correlation function of the process X (t).
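A minimal Monte Carlo sketch (an addition, not from the original text) that checks R_x (t, t') = λ min (t, t') for the Poisson process; λ, t and t' are illustrative values.

import numpy as np

# Monte Carlo check of R_x(t, t') = lambda * min(t, t') for the Poisson process.
rng = np.random.default_rng(1)
lam, t, t_prime, n = 2.0, 1.5, 4.0, 200_000

# X(t) and X(t') share the count on (0, t); the increment on (t, t') is independent.
X_t = rng.poisson(lam * t, size=n)
X_tp = X_t + rng.poisson(lam * (t_prime - t), size=n)

R_hat = np.mean((X_t - lam * t) * (X_tp - lam * t_prime))   # centred product
print(R_hat, lam * min(t, t_prime))                         # both ~ 3.0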
9.14. A stochastic process X (t) occurs as follows. There is a stationary (elementary) Poisson flow of events with intensity λ on the t-axis. The random function X (t) alternately assumes the values +1 and −1. Upon the occurrence of an event, it changes its value jumpwise from +1 to −1, or vice versa (Fig. 9.14a). Initially the random function X (t) may be either +1 with probability 1/2 or −1 with the same prob-
*) We neglect the possibility that an event might occur exactly at the moment t since the probability of this occurrence is zero.
ability. Find the characteristics m_x (t), Var_x (t) and R_x (t, t') of the random function X (t).
Solution. The section of the random function X (t) has a distribution represented by the ordered series
X (t):  −1, +1  with probabilities  0.5, 0.5.
Indeed, since the moments of sign changes are not connected with the value of the random function, there is no reason for either +1 or −1 to be more probable. Hence m_x (t) = −0.5 + 0.5 = 0, Var_x (t) = (−1)²/2 + 1²/2 = 1.

[Fig. 9.14: (a) a realization of X (t); (b) the graph of R_x (τ); (c) the surface R_x (t, t').]
To find the correlation function Rx (t, t'), we consider two arbitrary


sections of the random function, X (t) and X (t') (t' > t), and find the
mean value of their product:
R_x (t, t') = M [X̊ (t) X̊ (t')] = M [X (t) X (t')].


The product X (t) X (t') is equal to −1 if an odd number of events (sign changes) occurred between the points t and t', and to +1 if an even number of sign changes (zero inclusive) occurred. The probability that an even number of sign changes will occur during the time τ = t' − t is
P_even = Σ_{m=0}^{∞} [(λτ)^{2m} / (2m)!] e^{−λτ} = e^{−λτ} (e^{λτ} + e^{−λτ})/2;
similarly, the probability that an odd number of sign changes will occur during the time τ is
P_odd = e^{−λτ} (e^{λτ} − e^{−λτ})/2.

Hence
R_x (t, t') = (+1) P_even + (−1) P_odd = e^{−2λτ},
where τ = t' − t. Similarly, for t' < t,
R_x (t, t') = e^{−2λ(t−t')}.
Combining these formulas, we get R_x (t, t') = R_x (τ) = e^{−2λ|τ|}. The graph of this function is shown in Fig. 9.14b. The surface R_x (t, t') = e^{−2λ|t'−t|} is shown in Fig. 9.14c.
The random function X (t) is stationary. Its spectral density
S_x (ω) = (1/2π) ∫_{−∞}^{∞} R_x (τ) e^{−iωτ} dτ = (1/π) · 2λ / [(2λ)² + ω²].
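The result R_x (τ) = e^{−2λ|τ|} can also be checked by simulating the sign-switching process directly; the sketch below is an addition (with illustrative λ and τ) and uses the fact that the value at t + τ equals the value at t multiplied by (−1) to the power of the number of events in (t, t + τ).

import numpy as np

# Empirical check of R_x(tau) = exp(-2*lambda*|tau|) for the +/-1 telegraph signal.
rng = np.random.default_rng(2)
lam, tau, n = 1.0, 0.7, 400_000

x_t = rng.choice([-1.0, 1.0], size=n)            # value at time t
flips = rng.poisson(lam * tau, size=n)           # events in (t, t + tau)
x_t_tau = x_t * (-1.0) ** flips                  # parity of the flips changes the sign
print(np.mean(x_t * x_t_tau), np.exp(-2 * lam * tau))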

9.15. There is an elementary (stationary Poisson) flow of events with intensity λ on the t-axis. A stochastic process X (t) occurs as follows: when the ith event occurs in the flow (i = 1, 2, ...), it assumes a random value V_i and retains that value till the next event occurs in the flow (Fig. 9.15). At the initial moment X (0) = V₀. The random variables V₀, V₁, V₂, ..., V_i, ... are independent and have the same distribution with density φ (x). Find the characteristics of the process, viz. m_x (t), Var_x (t), R_x (t, t'). Is the process stationary?
[Fig. 9.15: a realization of the piecewise-constant process X (t).]
Solution. Any section of the random function X (t) has a distribution φ (x), hence
m_x (t) = M [V_i] = ∫_{−∞}^{∞} x φ (x) dx = m_v,
Var_x (t) = Var_v = ∫_{−∞}^{∞} (x − m_v)² φ (x) dx.
We find the correlation function R_x (t, t') by the same technique we used in Problem 9.14. Let us consider two sections X (t) and X (t') (t' > t) divided by the interval τ = t' − t. We have
R_x (t, t') = M [X̊ (t) X̊ (t')].
If no events occurred between the points t and t', then X (t') = X (t) and R_x (t, t') = M [(X̊ (t))²] = Var_x (t) = Var_v. If at least one event occurred between the points t and t', then M [X̊ (t) X̊ (t')] = 0. Hence
R_x (t, t') = e^{−λτ} Var_v + [1 − e^{−λτ}]·0 = Var_v e^{−λτ}.
Similarly, for t' < t we have
R_x (t, t') = Var_v e^{−λ(t−t')},
whence it can be seen that the process is stationary. Its correlation function R_x (τ) = Var_v e^{−λ|τ|} does not depend on the form of the distribution φ (x) and only depends on its variance Var_v.
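As a check that the answer does not depend on the form of φ (x), the following sketch (an addition; V is taken uniform on (0, 2) purely for illustration, λ and τ are illustrative) estimates R_x (τ) for the piecewise-constant process.

import numpy as np

# Check of R_x(tau) = Var_v * exp(-lambda*tau) for the process of Problem 9.15.
rng = np.random.default_rng(3)
lam, tau, n = 1.5, 0.4, 400_000

v_t = rng.uniform(0.0, 2.0, size=n)              # value held at time t
renewed = rng.poisson(lam * tau, size=n) > 0     # at least one event in (t, t + tau)?
v_new = rng.uniform(0.0, 2.0, size=n)
v_t_tau = np.where(renewed, v_new, v_t)

var_v = 4.0 / 12.0                               # variance of Uniform(0, 2)
R_hat = np.mean((v_t - 1.0) * (v_t_tau - 1.0))   # mean of Uniform(0, 2) is 1
print(R_hat, var_v * np.exp(-lam * tau))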
9.16. A process with independent sections. We consider the stochastic process X (t), described in the preceding problem, in the limiting case when the intensity λ of the flow of events tends to infinity. Investigate the behaviour of the characteristics of the process as λ → ∞.

[Fig. 9.16: (a) R_x (τ) for increasing λ; (b) the limiting correlation function.]

Solution. When λ increases, the correlation function R_x (τ) = Var_v e^{−λ|τ|} converges to the origin (Fig. 9.16a). In the limit we get a correlation function of the form
R̃_x (τ) = lim_{λ→∞} R_x (τ) = lim_{λ→∞} Var_v e^{−λ|τ|} = Var_v lim_{λ→∞} e^{−λ|τ|},
i.e. a function which is equal to zero everywhere except for τ = 0, and equal to Var_v for τ = 0 (Fig. 9.16b). We can write this as
R̃_x (τ) = Var_v for τ = 0,  0 for τ ≠ 0.
Since by the hypothesis of Problem 9.15 all the sections of the stochastic process X (t) are independent, we have obtained a model of a stochastic process for which any two arbitrarily close sections are independent. We can call this "a process with independent sections". A stochastic process with independent sections X (t), which we obtain in the limiting case as λ → ∞, does not possess any points of discontinuity.
9.17. White noise. Let us consider a limiting case for the stochastic process X (t), presented in Problem 9.15, under the condition that the intensity of the flow λ tends to infinity and at the same time the variance Var_v of each section tends to infinity with Var_v/λ = c = const. Find the characteristics of the resulting stochastic process Z (t) and show that the random function Z (t) is stationary white noise.
Solution. We investigated the behaviour of the stochastic process X (t) as λ → ∞ in Problem 9.16; we shall now take into account the additional conditions Var_v → ∞, Var_v/λ = c. Let us consider the

spectral density S_x (ω) for the stochastic process X (t) of Problem 9.15:
S_x (ω) = (Var_v/π) · λ/(λ² + ω²) = (Var_v λ^{−1}/π) · λ²/(λ² + ω²).
From the condition Var_v/λ = c we have Var_v/(πλ) = c/π = a = const.
The correlation function R_x (τ) = Var_v e^{−λ|τ|} = aπλ e^{−λ|τ|}.
For the stochastic process Z (t), which results as λ → ∞, Var_v → ∞, Var_v/λ = c, we get
R_z (τ) = lim_{λ→∞} R_x (τ) = lim_{λ→∞} aπλ e^{−λ|τ|} = aπ lim_{λ→∞} λ e^{−λ|τ|} = aπδ (τ),
where δ (τ) is the delta function (see Appendix 6);
S_z (ω) = lim_{λ→∞} aλ²/(λ² + ω²) = a.
We have thus verified that the stochastic process Z (t) is stationary white noise and constructed a model for its appearance. White noise can be regarded as a limiting case of a sequence of short independent similarly distributed pulses with large variance. In practice we encounter these processes when we consider natural interference such as thermal noise in electronic devices, shot effect, and so on.
9.18. We consider the stochastic process X (t) described in Problem 9.15 under the condition that the distribution of each random variable V_i (i = 0, 1, 2, ...) is normal with mean value m_v = 0 and variance Var_v. Find the characteristics m_x (t), Var_x (t) and R_x (t, t') of the stochastic process X (t). Is the process stationary? Is it normal?
Solution. Any section of the random function X (t) is normally distributed with parameters m_v = 0, σ_v = √Var_v. It follows from the solution of Problem 9.15 that the correlation function R_x (τ) = Var_v e^{−λ|τ|}. The process is stationary. But is it normal? No, it is not, despite the fact that the univariate distribution is normal. The joint distribution of two sections of the stochastic process X (t) is no longer normal since the sections coincide with nonzero probability, which cannot be the case if two random variables have a joint normal distribution.
9.19. A model of a flux of electrons in a radio valve. A flux of electrons directed from the cathode to the anode of a radio valve is an elementary flow with intensity λ. When an electron is absorbed by the anode, the voltage of the latter increases jumpwise by unity and then begins to decrease exponentially with parameter α, which depends on the characteristics of the electronic circuit (Fig. 9.19a). The increment of the voltage which is due to the arrival of an electron is summed up with the residual anode voltage. Find the characteristics of the stochastic process X (t), i.e. the anode voltage.
Solution. Electrons arrive at the anode at random moments T₁, T₂, ..., T_i, ..., which form an elementary flow of events. The voltage at the moment t which is due to the action of the ith electron,
arriving at the moment T_i, is
X_i (t) = 0 for t < T_i,  e^{−α(t−T_i)} for t ≥ T_i,
i.e. X_i (t) = e^{−α(t−T_i)} 1 (t − T_i), where 1 (t) is a unit function; T_i > 0, t > 0. Let us consider a random variable Y, which is the number of electrons arriving at the anode by the time t. This variable has a Poisson distribution with
[Fig. 9.19: (a) a realization of the anode voltage X (t) with jumps at the moments T₁, T₂, ...; (b) the points Θ₁, Θ₂, ..., Θ_Y uniformly distributed on (0, t).]

parameter λt. We represent the voltage X (t) as the sum of a random number of random terms:
X (t) = Σ_{i=1}^{Y} e^{−α(t−T_i)} 1 (t − T_i).   (9.19.1)
We have shown somewhat earlier (see Problem 8.80) that a Poisson flow of events on the interval (0, t) can be represented, with sufficient accuracy, as a collection of points on that interval, the coordinate Θ_i ∈ (0, t) of each of which is uniformly distributed on that interval (see Fig. 9.19b) and does not depend on the coordinates of the other points. Consequently, expression (9.19.1) can be rewritten in the form
X (t) = Σ_{i=1}^{Y} e^{−α(t−Θ_i)},   (9.19.2)
where the random variables Θ_i are independent and uniformly distributed in the interval (0, t).
We designate X_i (t) = e^{−α(t−Θ_i)} = e^{−αt} e^{αΘ_i}, and then
X (t) = Σ_{i=1}^{Y} X_i (t) = e^{−αt} Σ_{i=1}^{Y} e^{αΘ_i},   (9.19.3)
where X_i (t) are independent similarly distributed random variables and the random variable Y does not depend on the random variables

X_i (t) either. In accordance with the solution of Problem 7.64 we write the expressions for m_x (t) and Var_x (t):
m_x (t) = m_y (t) m_{x₁} (t),   (9.19.4)
Var_x (t) = m_y (t) Var_{x₁} (t) + Var_y (t) m²_{x₁} (t).   (9.19.5)
Since the random variable Y has a Poisson distribution with parameter λt, it follows that m_y (t) = Var_y (t) = λt. Let us find m_{x₁} (t):
m_{x₁} (t) = M [X₁ (t)] = (1/t) ∫₀ᵗ e^{−α(t−x)} dx = (1 − e^{−αt})/(αt).
Next we determine the second moment about the origin of the random variable X₁ (t):
M [X₁² (t)] = (1/t) ∫₀ᵗ [e^{−α(t−x)}]² dx = (1 − e^{−2αt})/(2αt).
Consequently,
m_x (t) = λt (1 − e^{−αt})/(αt) = λ (1 − e^{−αt})/α → λ/α,   (9.19.6)
Var_x (t) = λt [Var_{x₁} (t) + m²_{x₁} (t)] = λt M [X₁² (t)] = λ (1 − e^{−2αt})/(2α) → λ/(2α).   (9.19.7)

Note that as t → ∞ the mean value and variance of the process X (t) do not depend on time: lim_{t→∞} m_x (t) = m_x = λ/α, lim_{t→∞} Var_x (t) = Var_x = λ/(2α).
To find the distribution of the section of the stochastic process X (t) for m_x = λ/α > 20 we reason as follows. Considering a finite but sufficiently large interval (0, t) and assuming that a sufficiently large number Y of electron emissions take place on that interval, we see that the process X (t) [see formula (9.19.2)] is a sum of independent similarly distributed random variables, which has an approximately normal distribution since in this case the conditions of the central limit theorem are in fact fulfilled (see Chapter 8). Consequently, the section of the stochastic process has a normal distribution with characteristics m_x = λ/α and Var_x = λ/(2α). The process will practically become stationary in the time τ_st = 3/α.
To find the correlation function let us consider two sections of the stochastic process in question at the moments t and t' (t' > t). By virtue of the assumptions made we can assert that the anode voltage X (t') of the valve at the moment t' is equal to the voltage X (t) at the moment t multiplied by the exponent e^{−α(t'−t)} plus the voltage Y (t' − t) which results from the arrival of electrons at the anode in the time interval (t, t'):
X (t') = X (t) e^{−α(t'−t)} + Y (t' − t).   (9.19.8)
The stochastic processes X (t) and Y (t' − t) are evidently independent since they are generated by electrons arriving at the anode during different, nonoverlapping time intervals (0, t) and (t, t') respectively. The same can be said of the centred stochastic processes X̊ (t) and Y̊ (t' − t). Consequently,
R_x (t, t') = M [X̊ (t) {X̊ (t) e^{−α(t'−t)} + Y̊ (t' − t)}] = M [(X̊ (t))²] e^{−α(t'−t)} for t' > t,
R_x (t, t') = M [(X̊ (t'))²] e^{−α(t−t')} for t > t'.
Combining the last two expressions, we obtain
R_x (t, t') = Var_x (min (t, t')) e^{−α|t'−t|} = [λ/(2α)] [1 − e^{−2α min (t, t')}] e^{−α|t'−t|}.
Let us consider the limiting behaviour of the stochastic process when t → ∞, t' → ∞, but the value of their difference τ = t' − t is finite. In this case
R_x (τ) = Var_x e^{−α|τ|} = [λ/(2α)] e^{−α|τ|}.
Thus the stochastic process X (t) we consider in this problem is stationary and practically normal for t tending to infinity and λ/α > 20.
This problem is a more general case of Problem 9.13. Indeed, as α → 0, the anode voltage of the valve is a Poisson process since with the arrival of each new electron the voltage only increases by unity and does not decrease with time. Consequently, the following equalities must hold for any finite t and t':
lim_{α→0} m_x (t) = lim_{α→0} λ (1 − e^{−αt})/α = λt,  lim_{α→0} Var_x (t) = lim_{α→0} λ (1 − e^{−2αt})/(2α) = λt,
lim_{α→0} R_x (t, t') = lim_{α→0} [λ/(2α)] [1 − e^{−2α min (t, t')}] e^{−α|t'−t|} = λ min (t, t').
The reader is invited to verify their validity.
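The stationary characteristics m_x = λ/α and Var_x = λ/(2α) can also be verified by direct simulation of (9.19.2); the sketch below is an addition with illustrative parameters chosen so that λ/α = 30 > 20.

import numpy as np

# Simulation of the anode voltage of Problem 9.19 for large t:
# X(t) = sum over electrons of exp(-alpha*(t - Theta_i)), Theta_i uniform on (0, t).
rng = np.random.default_rng(4)
lam, alpha, t, n = 15.0, 0.5, 40.0, 20_000

samples = np.empty(n)
for k in range(n):
    y = rng.poisson(lam * t)                 # number of electrons by time t
    theta = rng.uniform(0.0, t, size=y)      # their arrival moments
    samples[k] = np.sum(np.exp(-alpha * (t - theta)))

print(samples.mean(), lam / alpha)           # ~ 30
print(samples.var(), lam / (2 * alpha))      # ~ 15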
9.20. The functioning of a linear detector. Under the conditions of Problem 9.19, we assume that electrons arrive at the anode in "bursts", the moments of arrival of the bursts forming an elementary flow with intensity λ. The number of electrons in the ith burst is a random variable W_i which is independent of the number of electrons in the other bursts and has a distribution F (w) with characteristics m_w, Var_w. This problem is equivalent to that of defining the output voltage of a linear detector, when positive pulses of the random value (of the highest voltage) W_i arrive at its input at random moments defined by the Poisson flow, and in the intervals between the pulses the voltage drops exponentially (Fig. 9.20). Find the characteristics of the process.
Solution. We can represent this process by a formula analogous to (9.19.2):
X (t) = Σ_{i=1}^{Y} W_i e^{−α(t−Θ_i)},   (9.20)
where the random variables Y, W_i, Θ_i are mutually independent.
We designate X_i (t) = W_i e^{−α(t−Θ_i)}, and then
M [X_i (t)] = m_w (1 − e^{−αt})/(αt),  M [X_i² (t)] = (Var_w + m_w²)(1 − e^{−2αt})/(2αt).
Consequently,
m_x (t) = λ m_w (1 − e^{−αt})/α,  Var_x (t) = λ (Var_w + m_w²)(1 − e^{−2αt})/(2α).
As t → ∞, we have
lim m_x (t) = m_x = λ m_w/α,  lim Var_x (t) = Var_x = λ (Var_w + m_w²)/(2α),
R_x (τ) = Var_x e^{−α|τ|}.
Let us consider the limiting behaviour of the process X (t) when the following quantities increase indefinitely: the intensity λ of the Poisson flow generating the pulses (λ → ∞), the variance of the amplitude of each pulse (Var_w → ∞) and the parameter α (α → ∞). An infinite increase of the quantity α signifies that the pulse voltage drops rapidly, i.e. in the limit, as α → ∞, the area of the pulse tends to zero, the speeds at which the quantities λ and Var_w increase being proportional to the speed of increase of the quantity α: λ = k₁α, Var_w = k₂α. We obtain (as t → ∞)
lim_{λ,α→∞} m_x = lim (λ/α) m_w = k₁ m_w,
lim_{λ,α,Var_w→∞} Var_x = lim λ (Var_w + m_w²)/(2α) → ∞,
lim_{λ,α,Var_w→∞} R_x (τ) = lim (k₁k₂/2) α e^{−α|τ|} = k₁k₂ δ (τ),
where δ (τ) is the delta function.
Thus, in the limit we have white noise, which results from an infinitely frequent series of pulses which have a finite expectation of the pulse amplitude and an infinite variance of that amplitude, as well as an infinitesimal duration of the pulse itself.
9.21. The shot effect. Let us consider a stochastic process X (t) generated by a Poisson process like the one in Problem 9.13. As the ith event of the Poisson flow occurs at a moment T_i, a nonnegative voltage pulse W_i arises in the electric circuit, whose random value (amplitude) then varies according to one and the same law φ (η), where the variable
η is reckoned from the moment T_i (Fig. 9.21) (φ (0) ≥ φ (η)). The random variables W_i are mutually independent and have the same distribution function F (w). The voltage in the circuit is the total effect of all the pulses with due regard for their variation in time.
The stochastic process being considered is known as the shot effect (shot noise). The stochastic processes investigated in Problems 9.19 and 9.20 are special cases of the shot effect. We must find the characteristics of shot noise.
Solution. In accordance with the solution of Problem 9.20, shot noise on the interval (0, t) can be represented, as in (9.20), in the form
X (t) = Σ_{i=1}^{Y} W_i φ (t − Θ_i),   (9.21)
where Y, W_i, Θ_i are mutually independent random variables, and, by analogy with Problem 9.20, the random variable Y has a Poisson distribution with parameter λt and the random variable Θ_i is uniformly distributed in the interval (0, t).
We designate X_i (t) = W_i φ (t − Θ_i) and then M [X_i (t)] = m_w t^{−1} Φ (t), where m_w = M [W_i], Φ (t) = ∫₀ᵗ φ (t − x) dx is the area of a pulse (the area bounded by the curve φ (η) and the coordinate axes);
M [X_i² (t)] = (Var_w + m_w²) t^{−1} Φ̃ (t),
where Var_w = Var [W_i] = ∫₀^{∞} (w − m_w)² dF (w), Φ̃ (t) = ∫₀ᵗ [φ (t − x)]² dx.
Consequently, m_x (t) = λ m_w Φ (t), Var_x (t) = λ (Var_w + m_w²) Φ̃ (t). When t → ∞ and the improper integrals exist,
lim_{t→∞} Φ (t) = ∫₀^{∞} φ (t − x) dx = Φ,  lim_{t→∞} Φ̃ (t) = ∫₀^{∞} φ² (t − x) dx = Φ̃,
m_x = λ m_w Φ,  Var_x = λ (Var_w + m_w²) Φ̃.
In practice these improper integrals always exist since the area of a pulse with an initial unit amplitude and the area defined by the square of the respective function are finite quantities in practical applications.
Reasoning as we did when deriving formula (9.19.8), we obtain X (t') = X (t) φ (t' − t) + Y (t' − t) for t' > t. Consequently,
R_x (t, t') = Var_x (t) φ (t' − t) for t' > t,  Var_x (t') φ (t − t') for t > t'.
Combining these two expressions, we find that
R_x (t, t') = Var_x (min (t, t')) φ (|t' − t|).
As t, t' → ∞, we get R_x (τ) = Var_x φ (|τ|) (τ = t' − t). Thus we see that as t, t' → ∞, shot noise is a stationary stochastic process.
Let us consider the limiting case when the intensity λ of the flow of pulses tends to infinity, the variance Var_w of the pulse also increases indefinitely, and the area Φ of a pulse with unit amplitude tends to zero, so that the variable Φ̃ also tends to zero. Then the following conditions are fulfilled:
lim_{λ→∞, Φ→0} λ m_w Φ = m,  lim_{λ,Var_w→∞, Φ̃→0} λ (Var_w + m_w²) Φ̃ = Var.
In that case shot noise turns into white noise with characteristics m_x = m, R_x (τ) = Var δ (τ), where δ (τ) is the delta function.
9.22. Impulse noise. Consider shot noise generated by rectangular pulses. The amplitude W_i and the duration x_i of a pulse arriving at the moment T_i are independent random variables with characteristics m_w, Var_w, m_x and Var_x respectively (Fig. 9.22). Find the mean value and variance of this stochastic process X (t).
Solution. The impulse noise considered on the interval (0, t) can be represented as
X (t) = Σ_{i=1}^{Y} W_i φ (t, Θ_i, x_i) = Σ_{i=1}^{Y} X_i (t),
where φ (t, Θ_i, x_i) is a pulse which begins at the moment Θ_i and whose height is unity and duration is x_i (the random variables Θ_i and x_i are independent): φ (t, Θ_i, x_i) = 1 (t − Θ_i)·1 (Θ_i + x_i − t). The random variable Θ_i is uniformly distributed within the interval (0, t).
Let us find M [φ (t, Θ_i, x_i)]. We advance a hypothesis that the random variable x_i assumes the value x. If this hypothesis is satisfied, we find the conditional expectation for sufficiently large values of t:
M_x [φ (t, Θ_i, x)] = (1/t) ∫₀ᵗ 1 (t − y) 1 (y + x − t) dy.
The integrand is a rectangular pulse with unit height and duration x, and the integral of that function is equal to the area of the pulse. Consequently,
M_x [φ (t, Θ, x)] = x/t,
whence
Consequently,
M [X_i (t)] = M [W_i φ (t, Θ_i, x_i)] = m_w m_x/t,
M [X_i² (t)] = M [(W_i φ (t, Θ_i, x_i))²] = (Var_w + m_w²) m_x/t,
M [X (t)] = M [Y] M [X_i (t)] = λt m_w m_x/t = λ m_w m_x,   (9.22.1)
Var [X (t)] = M [Y] M [X_i² (t)] = λ (Var_w + m_w²) m_x.   (9.22.2)
In the limit, when the intensity of the elementary flow and the variance Var_w of the pulse amplitude increase indefinitely, and the mean value m_x and the variance Var_x of the pulse duration decrease indefinitely, while the quantity Var [X (t)] = λ (Var_w + m_w²) m_x remains constant, the impulse noise turns into white noise with mean value M [X (t)] = λ m_w m_x and correlation function R_x (τ) = Var [X (t)]·δ (τ).
9.23. Change in the quantity of homogeneous elements. Consider a process X (t), i.e. the number of homogeneous elements functioning at a moment t. We assume that each element functions for a certain random time x, which has an exponential distribution with parameter μ, similar for all the elements, and then fails ("decays"). The beginning of the functioning (the beginning of the "life time") of each element is accidental and is defined by an elementary flow with intensity λ. For example, X (t) is the number of computers used, the intensity λ is the number of computers produced per unit time, x is the random life time of a computer (the average time a computer operates is 1/μ). Find the characteristics of the stochastic process X (t).
Solution. The stochastic process X (t) is an impulse noise like the one considered in the preceding problem, the only difference being that W_i = 1 for any values of i (m_w = 1, Var_w = 0), and the random variable x has an exponential distribution with parameter μ. Consequently, for sufficiently large t [see formulas (9.22.1), (9.22.2)]
M [X (t)] = λ m_w m_x = λ/μ;  Var [X (t)] = λ (Var_w + m_w²) m_x = λ/μ.
We can prove that the univariate distribution of the stochastic process X (t) is a Poisson distribution with the characteristics obtained.
When seeking the correlation function of the stochastic process X (t) being considered, we can use a property of the random variable x, which is the duration of the pulse and has an exponential distribution. The property is that the "remainder" of the pulse duration x (Fig. 9.23),
reckoned not from its beginning T but from a certain moment t₁ (T < t₁ < T + x), also has an exponential distribution with parameter μ (see Problem 5.35).
We advance a hypothesis that the stochastic process X (t) = n (n = 0, 1, 2, ...). We designate the probability that the hypothesis is true as P_n (t) = P {X (t) = n}. Assuming that the hypothesis is true, we can write an expression for the random function X_n (t') (t' > t):
X_n (t') = Σ_{i=1}^{n} V_i + Y (t' − t),
where V_i is a random variable which has the ordered series
V_i:  0, 1  with probabilities  1 − e^{−μ(t'−t)},  e^{−μ(t'−t)},
and Y (t' − t) is a stochastic process generated by the events occurring in the Poisson flow in the time interval (t, t'). The variable Σ_{i=1}^{n} V_i is the number of elements remaining at the moment t' if they were n in number at the moment t.
Let us find the conditional expectation of the product X (t) X (t') provided that X (t) = n:
M [n X_n (t')] = n M [X_n (t')] = n M [Σ_{i=1}^{n} V_i + Y (t' − t)]
  = n {n e^{−μ(t'−t)} + M [Y (t' − t)]} = n² e^{−μ(t'−t)} + n (λ/μ) [1 − e^{−μ(t'−t)}],
since M [Y (t' − t)] = (λ/μ) [1 − e^{−μ(t'−t)}]. Consequently, the unconditional expectation of the product X (t) X (t') is
M [X (t) X (t')] = Σ_{n=0}^{∞} M [n X_n (t')] P_n (t)
  = Σ_{n=0}^{∞} n² e^{−μ(t'−t)} P_n (t) + Σ_{n=0}^{∞} n (λ/μ) [1 − e^{−μ(t'−t)}] P_n (t)
  = M [X² (t)] e^{−μ(t'−t)} + (λ/μ)² [1 − e^{−μ(t'−t)}].
Hence
R_x (t, t') = M [X (t) X (t')] − (λ/μ)² = {M [X² (t)] − (λ/μ)²} e^{−μ(t'−t)} = Var_x e^{−μ(t'−t)} for t' > t,
or
R_x (t, t') = Var_x e^{−μ(t−t')} for t > t'.
Combining these two formulas, we get (τ = t' − t)
R_x (τ) = Var_x e^{−μ|τ|} = (λ/μ) e^{−μ|τ|}.
For λ/μ > 20 the process of accumulation of homogeneous elements can be considered to be practically normal with characteristics m_x = λ/μ and R_x (τ) = (λ/μ) e^{−μ|τ|}.
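A simulation sketch (an addition, with illustrative λ, μ and τ) confirming the stationary mean λ/μ and the correlation (λ/μ) e^{−μτ} of the number of functioning elements.

import numpy as np

# Check of m_x = lam/mu and R_x(tau) = (lam/mu)*exp(-mu*tau) for Problem 9.23.
# T is taken large enough for the process to be practically stationary.
rng = np.random.default_rng(5)
lam, mu, tau, T, n = 6.0, 2.0, 0.5, 10.0, 20_000

x_T = np.empty(n)
x_T_tau = np.empty(n)
for k in range(n):
    m = rng.poisson(lam * (T + tau))              # elements "born" on (0, T + tau)
    birth = rng.uniform(0.0, T + tau, size=m)
    death = birth + rng.exponential(1.0 / mu, size=m)
    x_T[k] = np.sum((birth <= T) & (death > T))
    x_T_tau[k] = np.sum((birth <= T + tau) & (death > T + tau))

print(x_T.mean(), lam / mu)                                      # ~ 3
print(np.mean((x_T - lam / mu) * (x_T_tau - lam / mu)),
      (lam / mu) * np.exp(-mu * tau))                            # ~ 1.10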
9.24. One-dimensional random walk of a particle. Let us consider, on
the x-axis, a particle which changes its position jumpwise (walks) due-
to random collisions with other particles (Fig. 9.24a). At the initial
moment the particle is at the origin, at the moment T1 of the first

[Fig. 9.24: (a) the jumps X₁, X₂, ... of the particle on the x-axis; (b) a realization of the process X (t).]

collision it jumps into a point X₁, at the moment T₂ of the second collision it jumps and changes its abscissa by the value X₂, at the moment T₃ it changes its abscissa by the value X₃, and so on. The random variables X₁, X₂, ..., X_i, ... are independent and have the same distribution with m_x = 0 and Var_x = Var. The moments T₁, T₂, ..., T_i, ... at which collisions occur form an elementary flow of events with intensity λ.
We consider a stochastic process X (t), the abscissa of a point walking at random, as a function of time (Fig. 9.24b). (The values of X (t) may be both positive and negative.) The process X (t) is a simplified model of the Brownian movement of a particle. Find the characteristics of the stochastic process X (t).
Solution. The number of events on the interval (0, t) is a random variable Y which has a Poisson distribution with parameter λt. The value of the process X (t) at the moment t is defined by the formula X (t) = Σ_{i=1}^{Y} X_i, i.e. it is the sum of a random number of random terms; in this case all the random variables X_i and Y are independent. As in Problem 9.20 we have m_x (t) = M [Y] M [X_i] = 0, since M [Y] = λt, M [X_i] = m_x = 0, and Var_x (t) = M [Y] M [X_i²] = λt Var.

By analogy with the solution of Problem 9.20, for a sufficiently large t the section of the random function X (t) has a practically normal distribution with the parameters obtained.
We seek the correlation function of the process X (t), for which purpose we consider two sections X (t) and X (t') (t' > t). It is evident that X (t') = X (t) + Y (t' − t). Here, as in Problem 9.20, the stochastic process Y (t' − t) = Σ_{i=1}^{Z} X_i, where Z is the number of events occurring in the flow on the time interval (t, t'). We have indicated that the stochastic processes X (t), X (t') and Y (t' − t) have a zero expectation. Consequently, for t' > t we have
R_x (t, t') = M [X (t) X (t')] = M [X (t) {X (t) + Y (t' − t)}]
  = M [X² (t)] + M [X (t) Y (t' − t)]
  = Var_x (t) + M [X (t)] M [Y (t' − t)] = Var_x (t).
For t' < t we get R_x (t, t') = Var_x (t'). Thus
R_x (t, t') = Var·λ·min (t, t').
The stochastic process X (t) we have considered is a process with independent increments.
Remark. A one-dimensional random walk of a particle has the same characteristics as the process Y (t) = X (t) − λt, where X (t) is a Poisson process, despite the fact that their realizations differ considerably (compare Fig. 9.13e and Fig. 9.24b).
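A short check (an addition, not from the original text) of Var [X (t)] = λ Var t for the random walk; the jumps are taken normal with Var = 0.25 purely for illustration.

import numpy as np

# Check of Var[X(t)] = lambda * Var * t for the compound-Poisson walk of Problem 9.24.
rng = np.random.default_rng(6)
lam, var, t, n = 3.0, 0.25, 2.0, 200_000

counts = rng.poisson(lam * t, size=n)
# given the number of jumps k, the sum of k independent N(0, var) jumps is N(0, k*var)
walk = rng.normal(0.0, 1.0, size=n) * np.sqrt(counts * var)
print(walk.var(), lam * var * t)          # both ~ 1.5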

9.25. The Wiener process. We consider the limiting behaviour of a random process X (t), which is a one-dimensional random walk of a particle (see Problem 9.24), for an infinite increase in the intensity λ of the flow of collisions and a simultaneous infinite decrease in the variance Var of the displacement of the particle. In that case the condition λ Var = a = const is satisfied. Show that in this limiting case we have a Wiener process for time intervals sufficiently distant from the origin.
Solution. The condition m_x (t) = 0 follows from the hypothesis of the preceding problem. We have also shown there that the increments are independent and the process is normal as t → ∞.
Var [X (t₁) − X (t₂)] = M [(X (t₁) − X (t₂))²] since m_x = 0;
M [(X (t₁) − X (t₂))²] = M [X² (t₁) + X² (t₂) − 2X (t₁) X (t₂)]
  = Var [X (t₁)] + Var [X (t₂)] − 2R_x (t₁, t₂).
It was shown in the preceding problem that Var [X (t)] = λ Var t, R_x (t₁, t₂) = λ Var min {t₁, t₂}. Hence for t₂ > t₁
Var [X (t₁) − X (t₂)] = λ Var t₁ + λ Var t₂ − 2λ Var min {t₁, t₂}
  = λ Var t₁ + λ Var t₂ − 2λ Var t₁ = λ Var (t₂ − t₁) = a (t₂ − t₁),
whence it follows that we have a Wiener process.
9.26. A random function X (t) is constructed as follows.
At the point t = 0 it assumes at random one of the values +1 or −1 with equal probability 1/2 and remains constant until t = 1. At the point t = 1 it again assumes one of the values +1 or −1 with the same probability 1/2 and independently of the value it has assumed on the preceding interval. This value is retained until the next integral-valued point t = 2, and so on. In general, the function X (t) is constant on any interval from n to n + 1, where n is a natural number, and assumes one of the values +1 or −1 with probability 1/2 on the boundary of each new interval independently of the preceding values (one of the

[Fig. 9.26: (a) a realization of the random function X (t); (b) the surface R_x (t, t') — unit cubes along the bisector t = t'.]

possible realizations of the random function X (t) is shown in Fig. 9.26a). Find the characteristics of the random function X (t), viz. the mean value, the variance and the correlation function. Determine whether the random function X (t) is stationary.
Solution.
m_x (t) = m_x = (−1)·(1/2) + 1·(1/2) = 0,
Var_x (t) = Var_x = (−1)²·(1/2) + 1²·(1/2) = 1.
We find the correlation function R_x (t, t'). If the points t and t' belong to the same interval (n, n + 1), where n is an integer, then R_x (t, t') = Var_x = 1, otherwise R_x (t, t') = 0. We can write this result in a more concise form if we denote the fractional part of the number t by b (t) (see Fig. 9.26a). Then we get (τ = t' − t)
R_x (t, t') = 1 for |τ| < 1 − b (min {t, t'}),  0 for |τ| > 1 − b (min {t, t'}).
This function depends not only on τ = t' − t but also on where the interval (t, t') is on the t-axis; consequently, the random function X (t) is nonstationary.
The surface R_x (t, t') looks like a number of cubes with an edge equal to unity which are placed on the (t, t')-plane along the bisector of the first quadrant t = t' so that the base diagonals coincide with the bisector (Fig. 9.26b).
9.27. A random function X (t) is formed in the same way as in Problem 9.26, the only difference being that the points at which the func-

tion assumes new values are not fixed on the t-axis but occupy random positions on it, retaining a constant distance equal to unity between them (Fig. 9.27a). All the positions of the reference point relative to the sequence of the moments when the function assumes new values are equiprobable. Find the characteristics of the random function X (t), viz. the mean value, variance and correlation function, and determine whether the random function X (t) is stationary.
Solution. As in the preceding problem, m_x (t) = m_x = 0, Var_x (t) = Var_x = 1.
[Fig. 9.27: (a) a realization of the random function X (t); (b) the graph of R_x (τ).]
Let us find the correlation function. We fix a moment t (Fig. 9.27a). This moment is random relative to the points at which the random function X (t) assumes new values. We designate as T the time interval which separates the point t from the nearest point at which a new value of X (t) will be assumed. The random variable T is uniformly distributed on the interval from 0 to 1. Assume t' > t, τ = t' − t > 0. If τ < T, then R_x (t, t') = 1, and if τ > T, then R_x (t, t') = 0. Therefore for 0 < τ < 1 we have
R_x (t, t') = P {T > τ}·1 + P {T < τ}·0 = P {T > τ} = 1 − τ.
Similarly, for τ < 0 we have
R_x (t, t') = 1 + τ for −1 < τ < 0.
Hence
R_x (τ) = 1 − |τ| for |τ| < 1,  0 for |τ| > 1.   (9.27)
The graph of this function is shown in Fig. 9.27b. Since R_x (t, t') = R_x (τ), the random function X (t) is stationary.
We can write the correlation function (9.27) in a more concise form with the aid of the unit function 1 (x):
R_x (τ) = (1 − |τ|)·1 (1 − |τ|).
9.28. The conditions of Problem 9.27 are changed so that at each random moment T_i, the moments being divided by unit intervals, the random function X (t) assumes (independently of the other values) the value U_i, which is a random variable with mean value m_u and variance Var_u, and retains that value till the next point. One of the realizations of this random function is shown in Fig. 9.28. Find the characteristics of this random function, viz. the mean value, the variance and the corre-
lation function. Determine whether the random function is stationary, and if it is, then find its spectral density.
Solution. Reasoning as in the preceding problem, we find
m_x (t) = M [X (t)] = m_u,  Var_x (t) = Var [X (t)] = Var_u,
R_x (τ) = Var_u (1 − |τ|) for |τ| < 1,  0 for |τ| > 1,  i.e. R_x (τ) = Var_u (1 − |τ|)·1 (1 − |τ|).
The random function X (t) is stationary. Its spectral density S_x (ω) = Var_u (1 − cos ω)/(πω²).
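The spectral density formula can be checked by numerically integrating the triangular correlation function; the sketch below is an addition with illustrative Var_u and ω.

import numpy as np

# Numerical check of S_x(omega) = Var_u*(1 - cos(omega))/(pi*omega**2) (Problem 9.28).
var_u, omega = 2.0, 1.3
tau = np.linspace(-1.0, 1.0, 200_001)
dtau = tau[1] - tau[0]
R = var_u * (1.0 - np.abs(tau))                            # R_x(tau), zero outside |tau| <= 1
S_num = np.sum(R * np.cos(omega * tau)) * dtau / (2 * np.pi)
S_formula = var_u * (1 - np.cos(omega)) / (np.pi * omega**2)
print(S_num, S_formula)                                    # both ~ 0.276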
9.29. A random function X (t) is an alternating step function (Fig. 9.29a) which assumes the values +1 and −1 alternately at unit intervals. The position of the step function with respect to the reference point is accidental; the random variable T, characterizing the displacement about the origin of the first point of the sign change, is a random variable uniformly distributed in the interval (0, 1). Find the characteristics of the random function X (t): the mean value, variance and correlation function.
Solution. Let us consider the section of the random function X (t); it may fall, with equal probability, on the interval on which the random function is equal to +1 or on the interval on which it is −1. Consequently, the ordered series of any section has the form
X (t):  −1, +1  with probabilities  1/2, 1/2,
whence m_x = −1·(1/2) + 1·(1/2) = 0 and Var_x = (−1)²·(1/2) + 1²·(1/2) = 1.
[Fig. 9.29: (a) a realization of the alternating step function X (t); (b) the auxiliary partition of the t-axis; (c) the graph of R_x (τ).]

Let us find the correlation function (τ = t' − t, t' > t):
R_x (t, t') = M [X̊ (t) X̊ (t + τ)] = M [X (t) X (t + τ)].
Since the product X (t) X (t + τ) can assume only two values (+1 or −1), it follows that M [X (t) X (t + τ)] = 1·p₁ + (−1)(1 − p₁) = 2p₁ − 1, where p₁ is the probability that the points t and t + τ will fall on intervals on which X (t) and X (t + τ) have the same sign.
Since the displacement T in Fig. 9.29a is uniformly distributed, we can transfer the reference point to the left end of the interval which includes the point t and consider the point t to be uniformly distributed
[Fig. 9.30: (a) a realization of the pulse sequence X (t); (b) the graph of R_x (τ), with local maxima γ (1 − γ) m_v² at integral points.]
in the interval (0, 1) (Fig. 9.29b). In this interpretation p₁ is the probability that the point (t + τ) will fall in one of the intervals (2n, 2n + 1), n = 0, ±1, ±2, ... (these intervals are marked with thick lines in Fig. 9.29b). Let us calculate this probability for different values of τ.
For 0 < τ < 1 the point (t + τ) may fall either in the interval (0, 1) or in the interval (1, 2), and therefore
p₁ = P {t + τ < 1} = P {t < 1 − τ} = 1 − τ.
For 1 < τ < 2 the point t + τ may fall either in the interval (1, 2) or in the interval (2, 3), and therefore
p₁ = P {t + τ > 2} = P {t > 2 − τ} = 1 − (2 − τ) = τ − 1.
Continuing with our reasoning, we get
p₁ = 1 − (τ − 2n) for 2n < τ < 2n + 1,  (τ − 2n) − 1 for 2n + 1 < τ < 2n + 2.
It can be seen that p₁ (and hence R_x (t, t + τ) = 2p₁ − 1 as well) depends only on τ and is an even function of τ. Consequently,
R_x (t, t + τ) = R_x (τ) = 4n + 1 − 2τ for 2n < τ < 2n + 1,  2τ − (4n + 3) for 2n + 1 < τ < 2n + 2.
Fig. 9.29c shows a graph of the correlation function.
9.30. A random function X (t) is a sequence of equidistant positive voltage pulses having the same width γ < 1/2. The beginning of each
pulse is separated by a unit interval from the beginning of the next pulse (Fig. 9.30a). The sequence of the pulses occupies a random position on the t-axis relative to the reference point (see the hypothesis of the preceding problem). The potential of the ith pulse V_i is accidental (i = 1, 2, ...). All the random variables V_i have the same distribution with mean value m_v and variance Var_v and are independent. Find the characteristics of the random function X (t): the mean value, variance and correlation function.
Solution. By the formula for the complete expectation m_x = m_v γ + 0·(1 − γ) = m_v γ. We find the variance in terms of the second moment about the origin: α₂ [X (t)] = α₂ [V_i] γ + 0·(1 − γ) = (Var_v + m_v²) γ. Hence
Var_x = α₂ [X (t)] − m_x² = (Var_v + m_v²) γ − m_v²γ² = γ Var_v + γ (1 − γ) m_v².
In this case the random function X (t) is not centred. We shall seek its correlation function proceeding from the second mixed moment about the origin:
R_x (t, t + τ) = M [X (t) X (t + τ)] − m_x².
Let us find M [X (t) X (t + τ)]. We use the complete expectation formula to calculate it. As in the preceding problem, we assume the t-axis to be covered with alternating intervals: the intervals marked with a thick line correspond to the pulses and those marked with a thin line correspond to the spaces between them. We denote by T the random value of the left-hand boundary of the interval (T, T + τ). Three hypotheses are possible:
H₁ — both points T and T + τ fall on the interval of the same pulse;
H₂ — one of the points, T or T + τ, falls on the interval of one pulse and the other on the interval of another pulse;
H₃ — at least one point, T or T + τ, falls outside the intervals of pulses.
On the first hypothesis the variables X (T) and X (T + τ) coincide and M [X (T) X (T + τ)] = M [V²] = Var_v + m_v². On the second hypothesis the variables X (T) and X (T + τ) are independent random variables with the same mean value m_v; by the theorem on the multiplication of expectations M [X (T) X (T + τ)] = m_v². On the third hypothesis M [X (T) X (T + τ)] = 0. The complete expectation
M [X (t) X (t + τ)] = P {H₁} (Var_v + m_v²) + P {H₂} m_v².
The probabilities P {H₁} and P {H₂}, and hence the correlation function, depend only on τ:
(1) for 0 < τ < γ:
P {H₁} = γ − τ, P {H₂} = 0, M [X (t) X (t + τ)] = (γ − τ)(Var_v + m_v²),
R_x (τ) = (γ − τ)(Var_v + m_v²) − γ²m_v²;

(2) for γ < τ < 1 − γ:
P {H₁} = 0, P {H₂} = 0, M [X (t) X (t + τ)] = 0;
(3) for 1 − γ ≤ τ ≤ 1:
P {H₁} = 0, P {H₂} = γ − (1 − τ),
M [X (t) X (t + τ)] = [γ − (1 − τ)] m_v²,
R_x (τ) = (γ − 1 + τ − γ²) m_v².
The next intervals of the values of τ can be investigated in the same way.
The graph of the function R_x (τ) is shown in Fig. 9.30b. For |τ| > 1/2 the curve R_x (τ) is repeated periodically, attaining local maxima equal to γ (1 − γ) m_v² at integral-valued points.
9.31*. We consider a stationary random function X (t), which is a sawtooth voltage (Fig. 9.31a). The reference point occupies a random position relative to the teeth, just as in Problem 9.29. Find the mean value, the variance and the correlation function of the random function X (t).
Solution. We can easily find the mean value m_x if we take into account that X (t) has a uniform distribution on the interval (0, 1) for any t. Hence m_x = 1/2.
To find the correlation function, we rigidly connect the sequence of teeth with the t-axis and throw the beginning t of the interval (t, t + τ) at random on the t-axis (Fig. 9.31b). Since the teeth are periodic, it is sufficient to throw the point t at random on the first interval (0, 1), distributing it with constant density. As can be seen from Fig. 9.31b, in that case X (t) = t and the value X (t + τ) is equal to the fractional part of the number t + τ, i.e. X (t + τ) = t + τ − E (t + τ), where E (t + τ) is the integer part of the number (t + τ).
If the integral part of the number τ is n (n ≤ τ < n + 1), then
E (t + τ) = n for t + τ < n + 1,  n + 1 for t + τ > n + 1,
and, hence,
X (t + τ) = t + τ − n for t < n + 1 − τ,  t + τ − (n + 1) for t > n + 1 − τ.
From the formula for the mean value of a function of the random variable t we have, for n ≤ τ < n + 1,
M [X (t) X (t + τ)] = ∫₀¹ X (t) X (t + τ)·1·dt
  = ∫₀^{n+1−τ} t (t + τ − n) dt + ∫_{n+1−τ}^{1} t (t + τ − n − 1) dt
  = (1/2)(n + 1 − τ)² + (1/2)(τ − n) − 1/6   (n = 0, ±1, ±2, ...).
It follows that the correlation function depends only on τ and for n ≤ τ ≤ n + 1 (n = 0, ±1, ±2, ...) has the form
R_x (t, t') = R_x (τ) = M [X (t) X (t + τ)] − m_x² = 0.5 (n + 1 − τ)² + (τ − n)/2 − 5/12.
This is a periodic function with unit period whose graph consists of periodically repeating segments of a parabola which are convex downwards. In the interval 0 ≤ τ < 1 this parabola has the form R_x (τ) = (1 − τ)²/2 + τ/2 − 5/12 with vertex at the point (1/2, −1/24). Setting τ = 0, we get Var_x = R_x (0) = 1/12.
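A numerical check (an addition, not from the original text) of the second mixed moment of the sawtooth process for an illustrative τ.

import numpy as np

# Check of M[X(t)X(t+tau)] = 0.5*(n+1-tau)**2 + (tau-n)/2 - 1/6 for Problem 9.31.
tau, n = 2.3, 2
t = np.linspace(0.0, 1.0, 1_000_001)
moment_num = np.mean(t * np.mod(t + tau, 1.0))        # average of X(t)*X(t+tau), t uniform
moment_formula = 0.5 * (n + 1 - tau)**2 + (tau - n) / 2 - 1.0 / 6.0
print(moment_num, moment_formula)                     # both ~ 0.2283
print(moment_formula - 0.25)                          # R_x(tau), after subtracting m_x**2 = 1/4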
9.32. We consider a linear transformation of n stochastic processes X₁ (t), X₂ (t), ..., X_n (t) of the form
Y (t) = φ₀ (t) + Σ_{i=1}^{n} φ_i (t) X_i (t),
where φ_i (t) (i = 0, 1, 2, ..., n) are nonrandom functions of time. We know the characteristics of the stochastic processes X_i (t): m_i (t), Var_i (t), R_i (t, t') (i = 1, 2, ..., n) and the crosscorrelation functions R_ij (t, t') = M [X̊_i (t) X̊_j (t')] (i, j = 1, 2, ..., n, i ≠ j). Find the characteristics of the stochastic process Y (t).
Solution. m_y (t) = φ₀ (t) + Σ_{i=1}^{n} φ_i (t) m_i (t),
R_y (t, t') = M [Y̊ (t) Y̊ (t')] = M [Σ_{i=1}^{n} φ_i (t) X̊_i (t) · Σ_{j=1}^{n} φ_j (t') X̊_j (t')]
  = Σ_{i=1}^{n} φ_i (t) φ_i (t') R_i (t, t')
  + Σ_{i<j} φ_i (t) φ_j (t') R_ij (t, t') + Σ_{i<j} φ_i (t') φ_j (t) R_ij (t', t),
Var_y (t) = R_y (t, t) = Σ_{i=1}^{n} φ_i² (t) Var_i (t) + 2 Σ_{i<j} φ_i (t) φ_j (t) R_ij (t, t).
If the stochastic processes X_i (t) (i = 1, 2, ..., n) are uncorrelated (R_ij (t, t') ≡ 0, i, j = 1, 2, ..., n, i ≠ j), then
R_y (t, t') = Σ_{i=1}^{n} φ_i (t) φ_i (t') R_i (t, t'),
Var_y (t) = Σ_{i=1}^{n} φ_i² (t) Var_i (t).

9.33. Given two uncorrelated random functions X (t) and Y (t) with characteristics
m_x (t) = t², R_x (t, t') = e^{α₁(t+t')},  m_y (t) = 1, R_y (t, t') = e^{α₂(t−t')²},
find the characteristics of the random function Z (t) = X (t) + tY (t) + t². Solve the same problem given that the random functions X (t) and Y (t) are correlated and their crosscorrelation function R_xy (t, t') = a e^{−α|t'−t|}.
Solution. If R_xy (t, t') ≡ 0, then
m_z (t) = m_x (t) + t m_y (t) + t² = 2t² + t,
R_z (t, t') = R_x (t, t') + tt' R_y (t, t') = e^{α₁(t+t')} + tt' e^{α₂(t−t')²}.
If R_xy (t, t') = a e^{−α|t'−t|}, then m_z (t) does not change, and
R_z (t, t') = R_x (t, t') + tt' R_y (t, t') + t' R_xy (t, t') + t R_xy (t', t)
  = e^{α₁(t+t')} + tt' e^{α₂(t−t')²} + a (t + t') e^{−α|t'−t|}.

9.34. Find the mean value and the correlation function of the sum of two uncorrelated random functions X (t) and Y (t) with characteristics
m_x (t) = t, R_x (t, t') = tt',  m_y (t) = −t, R_y (t, t') = tt' e^{α(t+t')}.
Answer. m_z (t) = m_x (t) + m_y (t) = 0, R_z (t, t') = R_x (t, t') + R_y (t, t') = tt' [1 + e^{α(t+t')}].
9.35. Given a complex random function Z (t) = X (t) + iY (t), where i is the imaginary unit and X (t), Y (t) are uncorrelated random functions with characteristics
m_x (t) = t², R_x (t, t') = e^{−α₁(t−t')²},  m_y (t) = 1, R_y (t, t') = e^{2α₁(t+t')},
find the characteristics of the random function Z (t): m_z (t), R_z (t, t') and Var_z (t).
Answer. m_z (t) = t² + i, R_z (t, t') = e^{−α₁(t−t')²} + e^{2α₁(t+t')},
Var_z (t) = R_z (t, t) = 1 + e^{4α₁t}.
9.36. A complex random function Z (t) is given in the form Z (t) = X (t) + iY (t), where
X (t) = Σ_{k=1}^{3} (a_k + V_k) e^{−α_k t},  Y (t) = Σ_{k=1}^{3} (b_k + U_k) e^{−β_k t}.
The mean values of all the random variables V_k and U_k (k = 1, 2, 3) are zero and the correlation matrix of the random variables (V₁, V₂, V₃, U₁, U₂, U₃) has the form
    1  0  0   1   0  0
       2  0   0  −1  0
          3   0   0  3
              1   0  0
                  2  0
                     3
Find the characteristics of the random function Z (t).
Answer. m_z (t) = Σ_{k=1}^{3} a_k e^{−α_k t} + i Σ_{k=1}^{3} b_k e^{−β_k t},
R_z (t, t') = R_x (t, t') + R_y (t, t') + i [R_xy (t', t) − R_xy (t, t')],
where R_x (t, t') = Σ_{k=1}^{3} k e^{−α_k (t+t')},  R_y (t, t') = Σ_{k=1}^{3} k e^{−β_k (t+t')},
R_xy (t', t) = e^{−α₁t'−β₁t} − e^{−α₂t'−β₂t} + 3e^{−α₃t'−β₃t},
R_xy (t, t') = e^{−α₁t−β₁t'} − e^{−α₂t−β₂t'} + 3e^{−α₃t−β₃t'}.
9.37. Correlation function of a product. Given two uncorrelated centred random functions X (t) and Y (t) and their product Z (t) = X (t) Y (t), prove that the correlation function of the product is equal to the product of the correlation functions of the factors: R_z (t, t') = R_x (t, t') R_y (t, t').
Solution. R_z (t, t') = M [Z̊ (t) Z̊ (t')]; Z̊ (t) = Z (t) − m_z (t). Since the random functions X (t) and Y (t) are uncorrelated and centred, it follows that m_z (t) = m_x (t) m_y (t) = 0, hence
Z̊ (t) = Z (t) = X (t) Y (t) = X̊ (t) Y̊ (t),
R_z (t, t') = M [X̊ (t) Y̊ (t) X̊ (t') Y̊ (t')] = M [X̊ (t) X̊ (t')] M [Y̊ (t) Y̊ (t')] = R_x (t, t') R_y (t, t').
In particular, for t' = t,
Var_z (t) = Var_x (t) Var_y (t).
9.38. Prove that the correlation function of the product of independent centred random functions Z (t) = Π_{i=1}^{n} X_i (t) is equal to the product of the correlation functions of the factors:
R_z (t, t') = Π_{i=1}^{n} R_{x_i} (t, t').
Solution. The proof is similar to the preceding one, the only difference being that in order to use the theorem on the multiplication of mean values it is insufficient in this case for the factors to be uncorrelated, and it is sufficient for them to be independent.
9.39. A random function X (t) with characteristics m_x (t) = 0, R_x (t, t') is subjected to a nonhomogeneous linear transformation
Y (t) = L_t^{(0)} {X (t)} + φ (t),
where φ (t) is a nonrandom function. Find the crosscorrelation function R_xy (t, t').
Solution. We have X̊ (t) = X (t), Y̊ (t) = L_t^{(0)} {X (t)} = L_t^{(0)} {X̊ (t)}, since when the random function Y (t) is being centred the nonrandom term φ (t) is eliminated; hence
R_xy (t, t') = M [X̊ (t) Y̊ (t')] = M [X̊ (t) L_{t'}^{(0)} {X̊ (t')}] = L_{t'}^{(0)} {M [X̊ (t) X̊ (t')]} = L_{t'}^{(0)} {R_x (t, t')}.
9.40. Characteristics of the derivative of a stochastic process. There is a stochastic process X (t) with mean value m_x (t) and correlation function R_x (t, t'). Find the characteristics m_y (t), R_y (t, t') and Var_y (t) of its derivative Y (t) = dX (t)/dt. Also find the crosscorrelation function R_xy (t, t').
Solution. The random function Y (t) is connected with X (t) by a homogeneous linear transformation. Applying the general rules (9.0.7), (9.0.8), (9.0.9), we obtain
m_y (t) = dm_x (t)/dt,  R_y (t, t') = ∂²R_x (t, t')/∂t ∂t',
Var_y (t) = [R_y (t, t')]_{t'=t} = [∂²R_x (t, t')/∂t ∂t']_{t'=t},
R_xy (t, t') = M [X̊ (t) dX̊ (t')/dt'] = ∂R_x (t, t')/∂t'.
Note that
R_yx (t, t') = ∂R_x (t, t')/∂t.
9.41. A random function X (t) has characteristics m_x (t) = 1 and R_x (t, t') = e^{α(t+t')}. Find the characteristics of the random function Y (t) = t dX (t)/dt + 1. Find out whether the random functions X (t) and Y (t) are stationary.
Solution. Since the transformation t dX (t)/dt + 1 is linear,
m_y (t) = t (d/dt) m_x (t) + 1 = 1,
R_y (t, t') = tt' ∂²R_x (t, t')/∂t ∂t' = tt' α² e^{α(t+t')}.
Neither X (t) nor Y (t) is stationary since their correlation functions depend not only on τ = t' − t but also on each of the arguments t, t'.
9.42. A random function X (t) has characteristics
m_x (t) = t² − 1,  R_x (t, t') = 2e^{−α(t'−t)²}.
Find the characteristics of the random functions
Y (t) = tX (t) + t² + 1,  Z (t) = 2t dX (t)/dt + (1 − t)²,  U (t) = d²X (t)/dt² + 1.
Solution. m_y (t) = t m_x (t) + t² + 1 = t³ + t² − t + 1,
R_y (t, t') = 2tt' e^{−α(t'−t)²},
m_z (t) = 2t (d/dt) m_x (t) + (1 − t)² = 1 − 2t + 5t²,
R_z (t, t') = 4tt' ∂²R_x (t, t')/∂t ∂t' = 16αtt' e^{−α(t−t')²} [1 − 2α (t − t')²],
m_u (t) = d²m_x (t)/dt² + 1 = 3,  R_u (t, t') = ∂⁴R_x (t, t')/∂t² ∂t'².
When calculating R_z (t, t') we have found ∂²R_x (t, t')/∂t ∂t' and, consequently,
R_u (t, t') = ∂²/∂t ∂t' {4αe^{−α(t−t')²} [1 − 2α (t − t')²]}
  = 8α² e^{−α(t−t')²} [3 + 4α² (t' − t)⁴ − 12α (t − t')²].
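The mixed derivatives above are easy to verify symbolically; the following sketch is an addition using sympy.

import sympy as sp

# Symbolic check of the derivatives used in Problem 9.42.
t, tp, a = sp.symbols('t t_prime alpha', positive=True)
R_x = 2 * sp.exp(-a * (tp - t)**2)

R_z_core = sp.simplify(sp.diff(R_x, t, tp))        # d^2 R_x / dt dt'
R_u = sp.simplify(sp.diff(R_x, t, 2, tp, 2))       # d^4 R_x / dt^2 dt'^2

print(R_z_core)   # equivalent to 4*alpha*(1 - 2*alpha*(t - t')**2)*exp(-alpha*(t - t')**2)
print(sp.simplify(R_u - 8*a**2*sp.exp(-a*(t - tp)**2)
                  * (3 + 4*a**2*(tp - t)**4 - 12*a*(t - tp)**2)))   # 0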
9.43. A random function X (t) is specified by the expression X (t) = V cos ωt, where V is a random variable with characteristics m_v = 2, σ_v = 3. Find the characteristics of the random function X (t): m_x (t), R_x (t, t'), Var_x (t). Determine whether the random function X (t) is stationary. Find the characteristics of the random function Y (t) = X (t) + α dX (t)/dt, where α is a nonrandom variable. Is the random function Y (t) stationary?
Solution.
m_x (t) = m_v cos ωt = 2 cos ωt,
R_x (t, t') = Var_v cos ωt cos ωt' = 9 cos ωt cos ωt',
Var_x (t) = 9 cos² ωt.
We can represent the random function Y (t) thus:
Y (t) = V cos ωt + α (d/dt)(V cos ωt) = V (cos ωt − αω sin ωt),
whence we have m_y (t) = m_v (cos ωt − αω sin ωt) = 2 (cos ωt − αω sin ωt),
R_y (t, t') = 9 (cos ωt − αω sin ωt)(cos ωt' − αω sin ωt'),
Var_y (t) = 9 (cos ωt − αω sin ωt)².
The random functions X (t) and Y (t) are nonstationary.
9.44. A telephone exchange receives an elementary flow of requests with intensity λ. A random function X (t) is the number of requests received by the exchange during the time t (see Problem 9.13). Find the characteristics of its derivative Y (t).
Solution. In the ordinary sense the discontinuous random function, which is a Poisson process, is not differentiable. However, using the generalized delta-function, we can write the characteristics of the derivative. The transformation Y (t) = dX (t)/dt, which relates the random functions Y (t) and X (t), is a homogeneous linear transformation. Therefore, from Problem 9.13 we have
m_y (t) = (d/dt) m_x (t) = (d/dt)(λt) = λ,
R_y (t, t') = ∂²R_x (t, t')/∂t ∂t' = ∂/∂t {∂/∂t' [λt·1 (t' − t) + λt'·1 (t − t')]}
  = ∂/∂t {λt δ (t' − t) + λ·1 (t − t') − λt' δ (t − t')},
but λt δ (t' − t) = λt' δ (t − t'), whence
R_y (t, t') = ∂/∂t {λ·1 (t − t')} = λ δ (t − t') = λ δ (τ).
Thus, the correlation function of the random function Y (t) is proportional to the delta function, i.e. the function Y (t) is stationary white noise with intensity G = λ and the average level m_y = λ. The spectral density of such white noise is
S_y (ω) = (1/2π) ∫_{−∞}^{∞} λ δ (τ) e^{−iωτ} dτ = λ/(2π).
9.45. Find the characteristics of the stochastic process Y (t) which is equal to the derivative of the Wiener process X (t) of Problem 9.25, i.e. Y (t) = dX (t)/dt.
Solution. Since m_x (t) = 0, it follows that m_y (t) = 0. The correlation functions of the Wiener and Poisson processes are equal (to within a constant factor). Therefore, the correlation function of the derivative of the Wiener process (see the solution of the preceding problem) is proportional to the delta-function and the process Y (t) itself is stationary white noise.
9.46. Prove that the derivative of the process of a homogeneous random walk (Brownian movement) of a particle (see Problem 9.24) is stationary white noise.
Hint. Use the solution of Problem 9.44.
9.47. Characteristics of the integral of a stochastic process. Given a stochastic process X (t) and its characteristics m_x (t), R_x (t, t'), find the characteristics m_y (t), R_y (t, t') of the integral of the stochastic process Y (t) = ∫₀ᵗ X (τ) dτ and the crosscorrelation function R_xy (t, t').
Solution.
m_y (t) = ∫₀ᵗ m_x (τ) dτ,  R_y (t, t') = ∫₀ᵗ ∫₀^{t'} R_x (τ, τ') dτ dτ',
Var_y (t) = ∫₀ᵗ ∫₀ᵗ R_x (τ, τ') dτ dτ',  R_xy (t, t') = ∫₀^{t'} R_x (t, τ') dτ'.
Note that
R_yx (t, t') = ∫₀ᵗ R_x (τ, t') dτ.
We can prove that there is no random function X (t) different from zero for which Y (t) is stationary.
9.48. A random function X (t) has the following characteristics: m_x (t) = 0, R_x (t, t') = [1 + (t' − t)²]^{−1}. Find the characteristics of the random function Y (t) = ∫₀ᵗ X (τ) dτ. Determine whether the random functions X (t) and Y (t) are stationary.
Solution. Since the transformation ∫₀ᵗ X (τ) dτ is linear,
m_y (t) = ∫₀ᵗ m_x (τ) dτ = 0,
R_y (t, t') = ∫₀ᵗ ∫₀^{t'} R_x (τ, τ') dτ' dτ
  = ∫₀ᵗ (∫₀^{t'} [1 + (τ' − τ)²]^{−1} dτ') dτ
  = t arctan t + t' arctan t' − (t − t') arctan (t − t')
  − (1/2) ln [(1 + t²)(1 + t'²)(1 + (t − t')²)^{−1}].
The random function X (t) is stationary: R_x (t, t') = R_x (t' − t); the random function Y (t) = ∫₀ᵗ X (τ) dτ is nonstationary. Indeed, the variance of the random function is Var_y (t) = R_y (t, t) = 2t arctan t − ln (1 + t²), i.e. the variance depends on t.
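A numerical check (an addition, not from the original text) of Var_y (t) = 2t arctan t − ln (1 + t²) for an illustrative t.

import numpy as np

# Double-integral check of Var_y(t) for Problem 9.48 (t = 2.0 is illustrative).
t = 2.0
s = np.linspace(0.0, t, 2001)
ds = s[1] - s[0]
grid = 1.0 / (1.0 + (s[:, None] - s[None, :])**2)   # R_x(tau, tau') on a grid
var_num = np.sum(grid) * ds * ds                    # integral over (0, t) x (0, t)
var_formula = 2 * t * np.arctan(t) - np.log(1 + t**2)
print(var_num, var_formula)                         # both ~ 2.82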
9.49. A random function X (t) with characteristics m_x (t) = t² + 3, R_x (t, t') = 5tt' is subjected to a linear transformation of the form
Y (t) = ∫₀ᵗ τX (τ) dτ + t³.
Find the characteristics of the random function Y (t): m_y (t), R_y (t, t').
Solution.
m_y (t) = ∫₀ᵗ τ (τ² + 3) dτ + t³ = t⁴/4 + (3/2) t² + t³.
The homogeneous part of the linear transformation being considered is
L_t^{(0)} {X (t)} = ∫₀ᵗ τX (τ) dτ.
Consequently,
R_y (t, t') = ∫₀ᵗ dτ ∫₀^{t'} ττ' R_x (τ, τ') dτ' = 5 ∫₀ᵗ (∫₀^{t'} τ²τ'² dτ') dτ = (5/9) t³ (t')³.
9.50. A random function X (t), with characteristics m_x (t) = 0 and R_x (t, t') = 3e^{−(t+t')}, is subjected to a linear transformation of the form
Y (t) = −t dX (t)/dt + ∫₀ᵗ τX (τ) dτ + sin ωt.
Find the covariance of the random variables X (0) and Y (1) (i.e. of two sections of the random functions X (t) for t = 0 and Y (t') for t' = 1).
Solution. On the basis of the solution of Problem 9.39,
R_xy (t, t') = L_{t'}^{(0)} {R_x (t, t')},
where L_{t'}^{(0)} is the homogeneous part of the linear transformation applied to the argument t'. In this case
R_xy (t, t') = −3t' ∂e^{−(t+t')}/∂t' + 3 ∫₀^{t'} τ' e^{−(t+τ')} dτ'
  = 3t' e^{−(t+t')} + 3e^{−t} [e^{−t'} (−t' − 1) + 1] = 3e^{−t} (1 − e^{−t'}).
Setting t = 0, t' = 1, we obtain
Cov_{X(0), Y(1)} = R_xy (0, 1) = 3 (1 − e^{−1}) ≈ 1.90.
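The crosscorrelation function and the covariance can be verified symbolically; the sketch below is an addition.

import sympy as sp

# Symbolic check of R_xy(t, t') = 3*exp(-t)*(1 - exp(-t')) for Problem 9.50.
t, tp, s = sp.symbols('t t_prime s', positive=True)
R_x = 3 * sp.exp(-(t + tp))

# homogeneous part of the transformation applied to the argument t'
R_xy = -tp * sp.diff(R_x, tp) + sp.integrate(s * R_x.subs(tp, s), (s, 0, tp))
print(sp.simplify(R_xy - 3*sp.exp(-t)*(1 - sp.exp(-tp))))   # 0
print(R_xy.subs({t: 0, tp: 1}).evalf())                     # ~ 1.896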
9.51. A random input signal X (t) is transformed by a relay into a random output signal Y (t) which is in a nonlinear relationship with X (t): Y (t) = sign X (t), i.e.
Y (t) = 1 for X (t) > 0,  0 for X (t) = 0,  −1 for X (t) < 0.
The input signal is the random function X (t) considered in Problem 9.15. Find the distribution of a section of the random function Y (t) and its characteristics m_y (t), R_y (t, t').
Solution. The random function Y (t) can assume only two values, +1 and −1; the value 0 can be neglected since P {X (t) = 0} = 0. The probability that X (t) > 0 is p = ∫₀^{∞} φ (x) dx. The ordered series of the random variable Y (t) has the form
Y (t):  −1, +1  with probabilities  1 − p, p.
Hence m_y = 2p − 1, Var_y = 1 − (2p − 1)² = 4p (1 − p).
Hence my = 2p - 1, Vary = 1 - (2p - 1) 2 = 4p ('1 - p).


Assume T = t' - t and t' > t. If no event occurred in the Poisson
flow during the time ,; (and the probabilit y of this is e-'-'t), then the
values of the random function Y (t) and Y (t') are equal to each other-
and the conditiona l correlation function Ry (t, t') =Vary = 4p (1 - p).
~ow if at least one event occurred during the time 't', then Y (t) and
~ (t') are uncorrelat ed and the conditiona l correlation function R Y (t, t')
1s zero. Hence, for t' > t
Ry (t, t') = e-l.qp (1- p),
and in a general case (for any t and t')
Rv (t, t') = Rv ('t') = e-'- I 't 14p ( 1- p).
9.52. The random input signal X (t) considered in Problem 9.15 is
transformed into a random output signal Y (t) by means of a relay
:304 Appl1ed Problems in Probabrllty Theory

w1th a de1d h1nd


s1gn X ( t) for IX (t)l > B,
lr(t)= 0 for rx (t)[ < e,
"here e 1s the tlca d band of the rela}
F tnd the d 1s trtbu t 10 n of the sect ton of the random funct 1on Y (t) and
1 ts chnractcr1st 1c~ the mean v (l} ue .n nd the corre I at 10n function
Solut1on The random \artablo Y (t) may assume one of the three
'a.I ues -1,. 0. t., for any t and has an orderod ser1cs

y (t)
-t 1 o 1 +t
'
Pt ' P2 ' p,

-e:
p 1 =P{..\(t)<-e)=) q1(x)dx,

p 2 =P{IX(t)l <E}=) ""'


cr(x)dx, p 3 = t-p,~p 2
-e
Hence my= Pa- Pi \rar11 = P1 t Ps- (P3- Ptr?:
Re'l~ontng as '\ c dtd 111 the preceding problem, \Ve find the corr~
latton functton
Ry (-r) = e ~ 1 '( 1 {Pt + P3- (p3- Pd 2
]

9 53 A random function X (t) lS transforn1ed tnto a random funct1on

Y (t) h} means of a nonlinear dev1ce \Vhose operat1on ts descr1bed by


the formula
-be for X(t)<-e,
Y(t)= bX(t) for ]X (t) I < ~,
be for X(t) > E
The graph of the relat1on<th1p of y (x) 1s sho" n 1n F1g 9 53a
The random function X (t) considered 1n Problem D 15 arrtves at
the tnput of the device F1nd the uni\ art ate dtstrlbution of the random
Ch. 9. Random Functions 305

function Y (t) and its characteristics: mean value and correlation


function.
Solution. The random variable Y (t), a section of the random function
Y (t), has a continuous distribution in the open interval ( -bs, +bs)
and, in addition, discrete possible values -bs and +bs with a nonzero
probability. Thus the section Y (t) is a mixed random variable whose
distribution function F (y) is continuous on the interval ( -bs, +bs)
and suffers discontinuities at the end-points of the interval (points
-be and +bs). The jumps of F (t) at the points of discontinuity are
-e
P{Y(t)=-bs}=P {X(t)<-s}=) cp(x)dx=p 11
-eX>

.
00

P{Y(t)= +be}=P{X(t)>s }= ~ q>(x)dx=p 2 •


e
Let us find the distribution function of the random variable Y (t)
jn the interval (-be; +be):
Y/b
F(y)=P{Y(t)<y} =P{X(t) <f}= J cp(x)dx
-oo
y/b
=p1 +~ cp(x)dx(-bs<y<b e.).
-e
The graph of the distribution function F (y) is shown in Fig. 9.53b .
. The distribution density of the mixed random variable Y (t) in the
Interval (-be, +be) is equal to the derivative ofF (y) on that interval:
1
f (y) = F' (y) = T cp (y/b) for - bs < y <+be.

The characteristics of the random function Y (t) are


be
my (t) =my= -bep 1 +bep2 +-} J yep (f) dy
-be
e
=be (p 2 - p 1) + b Jxcp (x) dx,
-e
Vary (t) = a. 2 [Y (t)]- m~ = (eb) (p + p
2
1 2)
be
+-} J y2cp(f)dy-m~=Vary.
-be
B~ ~nalogy with the preceding problems Ry (,;)=Vary e-i..hl,
T7 :n4. Consider a random function X (t) = W cos (w 1 t - 8), where
I Is a eentred random variable with variance Var w• 8 is a random
~0-0575
306 Applied Probl~m1 tn Probablltty Theorv

var1able dtstrJbutcd 'v1th constant dens1ty 1n the Interval (0, 2=t)1


and w1 1s a nonrandom parameter (w 1 > 0)~ The random variables W
e
and are mutually Independent. Find the mean value and the cor~:ela­
tlon functton of the random funct1on X (t) Determine whether the
random function X (t) lS sta t1onary and crgodtc. If 1t IS stationary,
then find 1ts spectral dcnsi ty S x (w)

Varw/Z I?!('C')

0
7/(Zw,)

F1g. 9.54

Soluhon. \\7 e represent tho random funct1on X (t) 1n the form


X (t) = w cos (rolt- e) = w cos a cos mlt + Wsln e SlD mlt,.
\Ve designate lV cos = e ut
lV Sine = v and find the matn charac-
tertStlCS of the system of random variables U and V
11 [U J = l\1 [lJT cos OJ= 1\1 I lV] 1\l f cos 9} == 0,
~~[VI=~~ {~l' ~In 0] = l\1 (lV] ~~ [s1n El] =0,
Var [U] = :t\1 [(TV cos El) 2 ] = ~lllV 2 J :r-.1 [cosz B)= Varw l\1 [eos3 9J~
Var ( VJ = ~r ((JV Sin 8) 2 1= i\I f IVZjl\t (s1n 2 8J = Varw Af [stn.! Blt
Ruv = 1\1 {JV cos 0 JV Stn 8] == Var wl\1 [sln e cos BJ~
SinCe the value of e lS dtstribu ted uniformly ln the Interval (0, 2n),
we have

2n
l\l{s1n 8 cos 8] = rJ s1n z cos z i3t dx =D.
2
0

Thus 1\l [ Ul = Jtl [V] = 0, Var [ UJ = Var [V] = Var10/2. R 11 D :::: 0~~
Consequently, the expresston
X (t) = TV cos (oolt - e) = u cos rolt + v SID (l)lt
IS a spectral decomposttlon of a stattonary random function: mx = 0,
and the correlatzon function has the form R% {t'"} = (V.arw cos Wr~)/2
The graph of thts functton IS shown 1n Frg. 9 54
The random functton X (t) ts not ergod1c stnce the charaetertsttcs
found from one real1zatton do not coinCide w1th those determined from
several real1zat1ons~ Indeed, every realization of the random funct1on
X (t) IS a harmontc oscxllattan whose amplttude 1s a value assumed at
• Ch. 9. Random Functions 307

random by the variable W. The time average of each realization is zero


and coincides with the mean value of the random function X (t), hut
the variance and the correlation function, found as time averages
for one realization, do not coincide with the corresponding character-
istics of the random function X (t). For instance,
T T
lim
T-+oo
2 ~ JX
-T
2
(t)dt=lim
T->-oo
2 ~ JW
-T
2
--}f1+cos2(ro 1t-G)]dt=--}W2 •

Let us fi.nd the spectral density of the random function X (t). We


shall show that it is proportional to the delta-function: S x ( ro) =
Varw6 (w - w1 )/2 (0 < ro < oo). Indeed, for such a spectral density
the correlation function
00

Rx('t')= JSx(ffi)COSffi't'dro
0
00

\
= j Varw ( ) COS(J)'t' d ffi= Varw COSffi1't'
6 (J)-(J)i 2 1
2
0
which coincides with the correlation function for X (t). And since the
direct and inverse Fourier transformations define the spectral density
and the correlation function one-to-one, the expression for S x ( ffi) writ-
ten above yields the spectral density of the random function X (t).
If we use the complex form of the Fourier transformations rather
than a real one, we get the spectral density S~ (ffi) in the form
s~ (ffi) = Varw [ 6 (ffi +COt)+ 6 (ffi -ffit)J/4 ( - 00 <co< 00 ).

Note that by analogy we could also write Sx (ffi) = Varw [6 (co+ ro1) +
B(ro-w 1)]/2, but O(ffi+ffi 1)=0 for positive ffi(since ffi 1 >0).
9.55. Show that the sum of elementary random functions of the form
00

X (t) = mx+ 2j W 1 cos (ffitt


i=O
+ 8 1),
\vhere W 1 are centred random variables with variance Var 1 (i =
0. • 1, 2, ... ), 8 is a random variable uniformly distributed on the
1
ln~en·a~ (0, 2n) (i = 0, 1, 2, ... ) (all the random variables ei wi,
hemg Independent.), is the spectral decomposition (9.0.24) of the
random function X (t).
Solution. In accordance with the solution of the preceding problem
+
H'i cos (ffi 1t- 8 1) = U 1 cos ffi 1t V 1 sin ffi 1t,
wh ere U1 and l'1 are uncorrelated random variables with zero mean
Ya1ues. Consequently
00

X (t) =mx+ 2j (U 1 cos ro 1t+ Y 1 sin ffi 1 t),


i=O
and this is what we wished to prove.
20•
308 Applltd Problems ln ProhaiJility Tk'!ory

n
9 56 Cons1dcr n stochastic process Y (t) = ~ a 1X 1 {t) +b where
i=d
X1 (t) aro stat1onary uncorrelatcd stochastic processes '' tth character
1sttcs m 1 R 1 ("t) and St (<D) (i- 1 2 n) a" b bctng real num
ber~ F1nd the chnr1cter1'qttcs of the stochastic proce~c; } (t)
n n n
Ans" cr m11 - ~ a1m 1
i-1
+b R 11 (T) = lJ
i-1
a1Rl (-r) s: (ro} = ~ a1S; (ro)
·~1

The stochastiC' proc~~s } (t) IS st'\tlonar}


n
9 57 Con ~lder a stochastiC proce~s y (t)- II
I-.;
xf (t)
l
\Vhere Xt (t)
are 1ndependcnt st'ltionary ~tocha~tlc processes w1 th chnracterJStJcs m1
= 0 Rl (l'} andSi (oo) (l- t 2 n) Ftnd tl e~o character1shc~
Ans"er
n

J n Rr (-r) ei"'-r d-r:


"DO

-1 aol~t

The process Y (t) 1s st 1.t1on 'lry


9 58 F1nd the charactertsttcs of the random function X (t) "ho~e
spectral decomposttton IS g•ven 1n Problem 9 55
Solution 'Ve des1gnate lV1 (cos w 1t - 0) = Xi {t) and then
CIO

X(t)=m;(+~ X 1 (t)
, .... 1

In accordance ""1th the co.olutton of problems 9 54 and 9 56 "e have


0(1 00

l\1 [..\ (t)l- m:x: Rr ('t) = h Rr 1 (•) =-} h Var, cos m 't
i-1 i-1
0:.

S~ (ro) = ~ v:r, [6 (ro+ w1} + B(lil- ror)l (- oo < m< oo)


i-1

The random function X (t) ts stattonary but not ergodtc


9 59 Cons 1der a product of t" o uncorrel a ted sta t1o nary random
functions Z (t) - X (t) Y (t) the random function X (t) be1ng the
samo ns In Problem 9 14 (a random alternatton of the values +1 and
-1 'v1th an elementary no,v of stgn changes) and the random functton
Y (t) the same as 10 Problem 9 54 Ftnd the characteriStiCS of the
random function Z (t)
Solutaon \Ve have mx (t) - m 11 (t) = 0 m, (t) = 0 Rx (t t) =
e 21. I 't I and R y (t t ) = (Var w cos w1 't)/2 (,; - t - t) On the bas1s
of Problem 9 57
Rr (t t )- Rx (t t ) R 11 (t, t) = (Varwe~ 2 "-1" cos m1T)/2
Ch. 9. Rand om Func tions 309

Fig. 9.59 shows one of the poss ible reali zatio ns of the rand om func tion
Z (t) which is obta ined by mul tiply ing the corr espo ndin g ordi nate s of
the realizations of the rand om func tion s X (t) and Y (t).
9.60*. The crite rion of positive definiteness. We have a func tion Rx (-r)
with properties
(1) Rx{ --r)= Rx{ ,;), (2) Rx(O )>O , (3) !Rx(,;)j~Rx(O).

Find out whe ther the func tion Rx (,;) may be a corr elati on func tion
of a stati onar y rand om func tion , i.e. whe ther it has the prop erty of
positive definiteness. Show that
the sufficient cond ition for posi tive X(tj --,-I --. r
definiteness is the cond ition that 1
the function
t
ro 1
H !---
S.x(ro)=~ ~ Rx(' t)co sro,; d,; Y(t) -!
0 1 - +-- ---1--
(9. 60.1) I I I
t
be nonnegative for any valu e of (1),
-f
(9.60.2) z~}.:X(t)Y(t)
i.e. that whe n calc ulati ng the spec -
f -
tral dens ity by form ula (9.60 .1),
we should not get nega tive valu es
of this func tion for any w. --- --
Solution. We assu me S X (w)~ :::::--
O Fig. 9.59
an d prove that in that case
the function Rx (,;) = Rx (t- t') is posi tive defin ite. We have
00

Rx (-r) = ) Sx (w) cos ro,; del)


0

iS + i S x ( w) sin wt sin rot' dro.


"" 00

= x ( ro) cos wt cos wt' dw (9. 60.3)


0 0

tl The p~sitive defm itene ss of the func tion Rx (t- t') mea ns that for
d ~te. function cp (t) and any dom ain of integ ratio n (B) the follo wing con-
1 ton mus t be fulfi lled:

I I Rx (t- t') cp
(B) (B)
(t) cp (t') dt dt';;::::O.

Let us verif y this ineq uali ty with resp ect to func tion (9.60 .3):
00

J J{ JS
(B) (B) 0
x ( <o) cos wt cos wt' cp ( t) cp (t') dw
3i0 Applfed Problemr in Probabflfty Tht.ory

00

+ JSz (ro) sin rot sin rot' tp (t) «p (t') dro} dt dt'
0
(IC ..

= JS .x (ffi} { ) cos rot !fl (t) dt) cos rot' <p (t') dt'
0 (B) (B)

+ 1stn rot (fl (t) dt Jstn rot' T (t') dt'} dro.


(B) (lJ)

Deslgnattng

\ cos (i)t!p ( t) dt ='Ill dB, ro), Jsm rot cp (f) dt = 1l't (B, <l)),
(B) (B)

"e obtatn
J ) R.~: (t- t') cp (t) Gl (t') dt dt'
(D) {B)
00

= Js= (ro) (l1p• (B, w)Jl+ Nz (B, ~)r~) d(l)~o.


0

stnce by the hypothes1s S x ( w) ~ 0


'Ve can prove that condttton (9 60 3) IS not only sufficient but also
necessary for the correlatton function to be pos1 t1 ve defint te
9. 61 * F1nd out 'vhether the functton
ll

Rs('t) = e-a:lll (cosh ~·+i- smh~ I"EI} (a >0, ~> 0)

pos~essestha pro pert1es of a correlatton f unct1 on .


Solutton. "\Ve must vertfy that 1t has the foJlowtng propertres·
(1) Rx (0) > 0, {2) k:c ( -'t) = R% (or). (3) J llx (or) 1~ Rx (0),
CIC)

(4) S;(UJ) = 2~ JR= (r) e- ~1" d-r~O


-oo
1 for any oo

Properties (1) and (2) are obviousll Let us ver1fy the others.
(3) The funct1on R =-: ( -r) 1s even and, therefote, 1 t 1s sufficient to ln-
vesttgate 1ts behav1our for 't ~ 0.

R;,; {-r) = {-e-ta:-~n ( T+ 1)- f e-Ctt+lllt ( i-_1) ,


Stnce Rx (0) = 1, this e"'{presston cannot e:tceed untty 10 absolute
valuo For a,< ~ thts cond1t1on lS not fulfilled s1nco e- ca-~)1' 1ncreases
tndeftnttely as 't ~ oo. If a = ''
~, e get R:t (-r) 5:. 1, and for a > ~ we
Ch. 9. Ra nd om Functions ~311
- - - - - - - - ~ ~ ~ ~ ~ ~
·------- - - -
~ ~-
have -R x (-r) ~ 1. Th us pr op er ty (3) is fu lfi lle d on ly for a
00

(4) S~(ro)= in
00 ~d't= ~
~ Rx(-r) e- i 2 R e { (i + f)
-o o
00

j e-<cx+l3+iooh d-r}
00

X ) e-(cx-Hiro)~ d-r + ( 1 - T)
0 0

~+ex ~-ex }
1
= 2n R e
{
ex-~+iw + ex+~+iw
~-~~~{ 1 1 }
= 2n~ (ex-~)z+wz - (ex+~>z+wz
ex~~-~~~ 2ex
n · [(ex-~)z+wz] [(ex+~)z+wz]

of a co m pl ex nu m be r) . Fo r a = ~w e ha ve
for a~ ~ (Re is th e re al pa rt
s~ (ro) = 6 (ro).
Thus for a.,~ ~ th e fu nc tio n R x (-r) = e-cx 1~ 1 (c os h B-r X +i
e pr op er tie s of a co rr el at io n fu nc tio n.
X sinh B I 't I) possesses al l th
an d S~ (ro) fo r a. ,> p ar e sh ow n in Fi g. 9. 61 a, b.
The graphs of Rx (-r)

0 0
l
(a) (h)
Fi g. 9.61
st at io na ry ra nd om fu nc tio n X (t) w ho se
9.62. Show th at th er e is no rv al ( - ' t11 -r1 ) an d
x ( -r) is co ns ta nt in so m e in te
correlation fu nc tio n R
equal to zero in its ex te rio r . fu nc -
m e th e co nt ra ry , i.e . th at th er e is a ra nd om
. Solution. Le t us as su b > 0 fo r 1-rl < -r
io n fu nc tio n is eq ua l to
tiOn X (t) for w hi ch th e co rr el at e sp ec tr al de ns ity
1
and to zero for• I 't 1 > 1 -r • W e sh al l tr y to fin d th
of the ra nd om fu nc tio n X (t):
co 'C

S x ( (J)) = ..!_ \ R ('t) cos (l)'t d-r = 2_ b cos (J)'t d,; = !!... - sin 00 '1 •
n J0 -
y
n n w
0

on th at th e fu nc tio n S x ( ro) is ne ga tiv e


It can be seen from th is ex pr es si op er tie s of th e sp ec tr al
for some values of ro, an d th is co nt ra di c. ts th e pr
312 Appl1~d Problem' tn ProbabtlUy Th~ory

dens1ty and. consequently, there can he no correlatton functton of the


ktnd descrtbed
9.63 Does the functiOn Ex ('t) = Var.r e-o: I" 1 ~ sm ~ I 't' I possess
the propertieS of a correlation functton?
Answer. No, 1t does not, Since the follo\ving t\vo cond1t1ons are not
fulfilled Rx (0) > 0 and Rl'. (0) r R~ ("t) ] >
9.64 A stationary random functton X (t) has the charactertsttcs m=
and Rx (-r) Ftnd the crosscorrclation function Rl:l (t, t) of the random
functton X (t) and of the random function Y \t) = 1 -X (t)
Solution.
R~v (t, t ) = ~I [k (t) Y (t')l ~ ~~ [ - i (t) X (t')l
= -Rx (t, t') = -Rx ("t)
9.65. A random function X (t) has the charaetertstlCS m;u R,:, (-c) =
Var:s: e-c; 1-r ~ F1nd 1ts spectral density
Soluhon.
,.
I e-
ext

SHffi) = 2~ J Rx ('t) e-llllt do= V~r., Re (tJ.+foo)"r dT t


-OD 0

where Re ts the real part .. 'Ve have

Ja.+t uo
OCI 00

rJ e-fet+ t(J))t d-r: = e-la+iw)'l' d (a+ lf:t1} T


0 0
(et+Ho)~=y 00

- for 't=O, y=O = t


a+tm
Je-u dy
0

J
CIO
1 1
= a+ tro
, Re e-ta+ial)'r d-r = Re - --
a+ iro
0
I = Re t a-loo -Re a-lw
c:t+ ~(r) a- tw - ct'+ w'
= Re { a; a~ co~ - t c:t'~ co' ) = a•~ ro• •

Consequently, S~ ((r)) = Varx a


n aa+ru~ •
9.66. F1nd the spectral density of the random function X (t) tf 1ts
correlatton funct1on

Answer.
Ch. 9. Random Functions 313:

Hint. Represent cos B• = (e 1!3' +


e-if3o:)/2 and use the solutio n of
Problem 9.65.
9.67. Find the spectra l density of a station ary random functio n whose-
correlation functio n

Solution. We have
00 00

1
S.., (ro) = -=-- ~ Var e- (',,:r )Z e- tcuo:
• __.::::_ ~ e-(A.o:)Z-icoT d-r:.
d't = _Varx
x 2n x 2n
-oo -oo

Using the familia r formul a


00

Jf e-AxZ± 2Bx-c dx =
-

.,V/ An exp [ - -.....,-


ACA-B--
2
J (A>O )
-oo

and bearing in mind that fJ. = -1, we get

S *'x(ro) = Varx ., / n
2n V 1..,2 exp
[ ro
2
-- 41..,2
J-_ 4J...VarxVn exp [ - ro
2
4J...2
J•
The graph of this functio n is like that of a normal distribution~

R (1:)= 2a sinW(C"
2aw1 x -r
a s; (w)
I I
I I
0 w, w
(a) (6}
Fig. 9.68

9.68. The spectra l density of a station ary random functio n X (t) is


chonstant on the interva l from -ro 1 to +w 1 and zero outside it, i.e~
as the form shown in Fig. 9. 68a:
S<: a for Iro I < rot _
_ ( 1 ro 1)
x(ffi)- O f or Iro I> ffit -a1 1- rot .
Find the correlation functio n Rx ('t) of the random functio n X (t).
Solution. ·
00 co
Rx (-r:) = ~ s~ (ro) eiWT dw = 2a cos W't" dro = 2a sin rot•
'
-"" (}

V lll'x = R :c (0) = 2aw 1 •


11w graph of the correlat1"on function is l':bo"'n 1·n F"Ig. 9•68b ~ - H ;
~14 .Applle d Problems tn Proba btlfty Theor fl

9 69. The dertvatlve of a statio nary rando m functton G1ven a statton


ary random funct ion With mx (t) = mx, R% (t.. t ) = R~ (~), where
't = t -- t find the chara c ter1s t1cs of 1ts der1 va t1 ve Y ( t) ~ dX (t)/dt
f

and sho'v that tt ts statto nary too


Solutaon Since Y (t) IS relate d to X (t) v1a a homogeneous l1near
trans form ation 1t follo\ vs that
d d
m11 (t)= dt mz(t )= dt m~=O=const,

~ a2 , Jl
R 11 (t, t)=o tiJt R~(tt t)=a tiJt Rz(T )

= :, ( 0~ R~ (~}) = ! (~ R>: (~) :: )


But 8-r:/8t' = 1 nnd 8TI8t == -1 and, therefore,
, a [ d
Rll (t, t ) == iJt ~" Rx ("C)
J= a~
CI1Z R$ (T)
a"'
ol == -
d'
d"ti Rx (t")
Srnce the r1gbt hand stde of the equa tion depends only on T "e have
dt
R II ( t t ) = RII ( 't) == - d"tSI R:t ( 't) t
and the rando m funct ion Y (t) ts statio nary
9 70 A stat1o nary rando m funct ion X (t) has a corre lation functton
Rx ('t) A rando m funct ion Y (t) resul ts from 1ts dtfier entt 1t1on Y (t) =
dX (t)ldt Ftnd the corro lat1on funct1on R 11 ('t') 1f
(a) Rz ('t) = e-art(,. (b) Rx ('t) = e al11 ( 1 a I 't' I), +
(c) Rz (•) = e-~l'fl (cos~·+ i- 510 ~ IT r) (a> 0, ~ > 0)

Solut 1on To solve the probl em \Ve shall we the appa ratus of gener
ttl1zed funct 1ons, the rules for \Vhich are g1ven 1n Appe ndix 6
{a) R 11 (l') = - tfl2
d•
e ct 'tl = _ d
dt
[ -ae-C lht d
d~
l'tl J
=a [ -ae t*l ( d J:l ) + d~~:l
2
e-al<[l J
= ae a 'tl [2o ( T)- a (stgn t') 2]
The prese nce of the term 2 6 (.,;) s ho'' s that there 1s ,vht te notse rn the
funct ion Y (t) Next we VvTI te the ans'' ers and 1n vt to the reade r to vertfy
them
(b) Ru (l') = et2e-a l1l ( 1- ex 11' I)
(e) Ru(•)=(a.2 +f.~)e-al-rl (eos~t--fsm~ 1•1)
9 71. F1nd the spect ral dens1 t~ of a sta ttona ry rando m funct ion 'vlth
.a corre l a tton fu nctro n
R 11 ('t) = cte "I,. •I2B ('t') ....... a (s1gn 't) 11
Ch. 9. Ra nd om Functions 315

Solution.
00

St (ro) = 2 ~ J Ry (,;) e-irot d't


-oo
00

r J (sign
00

1 1 -n ,;) 2 e-cxltle-irot d't.


= 2n j cx.e-cxlt126 ( 't) e- irot d' t- -=-2
-oo -oo

Since
1 fo r 't =F 0,
(s ig n ,;) 2 = 't = 0
0 fo r
th e se co nd in te gr al ha s no sin gu la rit ie s at
and since th e in te gr an d of
the point 't = 0, we ca n ne gl ec t 't = 0 in Syli(0)
the second in te gr al . W e ge t
- ~
- - - - ·+-cC/7(- - -
S* (ro) = ~ - o:2 2o: ro2
Y n 2n o:2+ ro2 - n o:2+ ro2 •

~he gr ap h of th e sp ec tra l de ns ity St (ffi) 0 w


Is shown in Fi g. 9. 71. Fi g. 9.71
We co ul d ha ve ob ta in ed th e sp ec tra l
density S~ (ro) in a sim pl er wa y. W e th e ra nd om
nc tio n Y (t) as th e de riv at iv e of
represent th e ra nd om fu
func tion X (t) fro m Pr ob le m 9. 70 (a ). W e ha ve

Rx ('t )= e- cx ltl , S i( ff i) = * o: 2 ~ro2 ·

th e di ffe re nt ia tio n op er at or
~he am pl itu de -fr eq ue nc y ch ar ac te ris tic of
Is cD (iro) = iffi, an d, co ns eq ue nt ly ,

St (ffi) = S i ( (i)) I<D ( iffi) 12 =! o: 2 ~ ro2 I iffi 12


=*o:2 ~ro 2 •

at io na ry ra nd om fu nc tio n X (t) w ith a co rre la tio n


9. 72. Gi ve n a st va ria nc e an d
th e co rre la tio n fu nc tio n,
function Rx ('t) = (s in 't)/'t , ftn d
spec tra l de ns ity of its de riv at iv e: Y (t) = dX (t) /d t.
69 ,
Solution. Ac co rd in g to th e so lu tio n of Pr ob le m 9.
2
R ( ) __ 2
d Rx ('t) _ _ d ( sin 't )
y 't - d't2 - d't2 't •

in g (si n 't) /'t in to a M ac la ur in se rie s, we ge t


Ex pa nd
sin 't 't2 .~ 'ts
't = '1 - 3! + 51 - 7! + .. . '
1 't4
d2 't2

R y( T }= - d't 2 R x (' t) = s- 21·5 + 4! ·7 - .. ..


Hence V ar y= R 11 (0) = 1/3.
316 Appl ied Probl~ms In PrtJbablllty Theo ry

The spec tral dens tty S;(m ) = f 1 (i. -I col) (c:eo Item iS of Appen
2 ·1 ( 1-1 wl) i
dlX 6) Consequ ently ' s~ (ro) = s; (oo) cu =2 ro 2

9 73 A random funet1on X (t) has n corr elat1 on func \)on

R,1: (T) = e (li,J (cosh~·+ fsm hfll •l) (a~P> 0)


A rand om funct1on Y (t) = dX (t)ld t Ftnd Jts corr elati on funct1on
R 11 ('t) and spec tral dens1ty (w) s;
Solution To find R u (t'}, we use the prop ertie s of generaltzed fu.nc
t1ons (see (10])

R11 (t) = - ~~~ R:~: ('t) - - ![ -e-( ll-rl a d ~;1 {cosh Pt

+i-smhP!TI)+e (lJtl{psmhfh+acoslt~l•l dJ~I )]


=
d {~"~-at
d't - p e ah:i stnb PT
)
al-P ' [ a
= en 'tl cosh ~ IT I - lf s 1nh ~ r1" J
]

* _ 2tt6)1 a1- ~~
sll ((0)- l't [(a-~}'+ CJ)lj {(a+ ~)l+w2j
S1nce the It mit ltm R" (~) ex1s ts (1t 1s equa l to R 11
(0) ct2 - ~"). =
"( ... 0
the rand om func tion X (t) 1s d1fl eren tiabl e
9 74 Sbo\v that the cros scor relat 1on func tton R~ 11 (t t ) of a stat1on-
ary rand om func tion X (t) and tts deri vati ve Y (t) ::=. dX (t)/dt
sa t1sfies the eondt t1on R xv (tt t ) = - R XJ (t ~ t) • 1 e chan ges when the
argu men ts chan ge plac es
Solu t1on Assume R -x (t, t ) = R:c ("~"),. \vhere "t t' - t =
R;r, (t, t')= M[X (t) :t X(t)]= :e M( i(t} i(t) 1=: , R~{'t)
But 't = t - t and. cons eque ntly ,

R#C11 (t, t) = ! R,. ('t) :; == ! Rz {'f)


On the othe r hand ,

R#Cfl (t' o t) = ~ Rz (t, t )= d~ R,. ('t) !; = - :. R;A. ('t}""" - Rx11 (t, t )

-a.n~ tb.?l" ~~ -wh~t ""Wts han \1) tntrve


Thu s the cros scor relat ton func tion of a stati onar y rand om functton
and Its derl vati \e depe nds only on 't = t - t
d
R~v ('t') = - d"' R.x (f),
- Ch. 9. Random Functions 317

b/, i.e. the function X (t) and its derivative are "stationary correlated".
9.75. Prove that any stationary random function X (t) and the value
of its derivative Y (t) = dX (t)/dt at the same point tare uncorrelated,
and if the random function X (t) has a normal distribution, then they
are, in addition, independent.
Solution. It was shown in the preceding problem that Rxy (-r) =
d
r'
-dt Rx (1:). For t' = t, 't" =0
d d
Rxy (0) = - d-r Rx (0) = - d-r Varx = 0.
-
r

Thus a stationary random function at any point is not correlated with


its derivative at the same point.
For a normally distributed random function, the lack of correlation
between X (t) and its derivative at the same point implies their in-
dependence.
9.76. Find the characteristicb of a random function Y (t) =X (t) X
a;/t) if the normal random function X (t) is stationary and has
characteristics mx, Rx(-t)=Vare-al-rl {cos~,;+fsin~-r).
Solution. Y (t) = + d [~t(t)]
2
• We designate X 2 (t) = Z (t). In accor-
dance with (9.0.35) and (9.0.36), we have

lnz = mi + Rx (0) = m; + Var,


00

s: (ro) = 2 ~ S~ (ro- v) S~ (v) dv + 4m~S~ (ro)


-oo

V a r a. roz (ro 2+ 2Qcx.2 + 4~2)


= " (aZ+~Z) [(ro2+4cxz+4~2)2-16~2Ctl2} (Ctl2+4cx.2)
+4.nzx2• Varcx. 2(cx.2t~2) 4 ~~z =Si• (ro).
l"t (Ctlz+cx.z- ~)~+ a~

We can find the correlation function Rz (-r) from the expression


00

Rz (1:) = ) Si (ro) eiro-r dro.


-oo
Hence
1 d 1
S~(ro)=
my(t)= 2 dt mz=O, 2 rozS~((u).

0· 77 · Quadratic rectification of a stationary random function. A quad-


ra ~ rectification of a random function X (t) is a nonlinear transfor-
1

Inahon of the form Z (t) = a2 X 2 (t), where a is a constant and X (t) is


~random function. Find the characteristics of the random function at
e output. of n quadratic rectifier with parameter a = 2 if a normal
318 Ap p lted Prob lt m1 tn Pro I; a btllty Th~ory

stat ton ary function '\ 1th cha ract erts tics m" = 3t Rx (-r) = 4e-3 ltl
arr1ves at tts Inp ut
Solution~ \Ve des tgna te X 1 (t) = aX (t) The charactertsttcs of the
2
ran dom fun ctto n X 1 (t) are mx, = am: r = 6, R~ 1 ("t) = a Rx (t) =
16e-31 "\ The n Z (t) =X i (t) ''re set Var1 = 16,. a 1 = 3 Fro m formula
(9 0 35) "e find tha t m: = m1 1 + Rx 1 (0) = 36 + 16 = 52 From
form ula (9 0 36)
00

s:(c o}= 2 j S;1 (oo-v)S;1 (v)dv+4m~ 1 S; 1 (w)


-oa
00
Var1 a 1 1
=2J -co
Var 1 a 1
n
1
«1 +tro---v)2 "t a.l~+ v~ dv
\rar 1 a 1 t 4 Vnr f a 1 t 4 Var 1 ct1m} 1
+- 4m~ 1 !t n (2ct 1 )~ + ro"~
at+(Jl2 -
~

+ n (af +w~)
Con seq uen tly (see Item 6 In Append1x 7),
...., .......
R:. ( t') = Rt ('t") + R 2 ( -r) = Var l e-O: tlt I + Vat2 e- as!-r} t
_,
~ p

a 1 = 2a1 = 6~ Var~. =
~
2
whe re Var1 = 2Va rl = 2 X 16 = 512"
4Va r1 m~ . = 4 X 16 X 6 2
= 2 304, a 2 = 3 -
Thu s the stgn al at the out put of a qua dra tic rc.ct1fier can be represent.
ed as the sum of t" o unc orre la ted stat 1on ary stoc hast ic. proce~ses
Z (t) = X 1 (t) + X 2 (t) w1t h the cha ract eris tics Ind tcat ed above It
sho uld be emp has ized tha t the stoc has tic proc ess Z (t) ts not normal
9. 78. A nor mal rand on1 fun ctio n X (t) has cha ract eris tics mx, Rx ("t) =
Var x e-ah ! F1n d tho charac teriStlCS of the rand om fun c t1on JV (t) =
X (t) dX (t)
dt •
Soln t1on
W (t) =X (t) d!t( t) =+ ! xz (t),
S~ (ro) = 0 5ct12 S:s (ro)~
Smce the dtff eren ttat wn rs hne ar, we hav e mw (t) = mx• (t) But f#;
1n accordance w1t h the solut1on of the prec edin g pro blem , ~~ (t) ===
canst and, consequently~ mw (t) = 0 In the prec edin g problem we
fou nd the spe ctra l dens1 t y S ~· (ro), and , con seq uen tly,
s•10 (00) = _!_ oo2Slll (ro) = Var ~ 2a ro! + 2 Var:r m~a2m1

2 x~ 11: (!)1+ (2a) J n (wa +a )

and tn acc ord anc e Wit h 1tem iS of App end 1x 7 \Ve hav e
Var 2
R m ( 't) = :t :t e- 2(Lh;f (26 ( 't') - 2a ( stgn T) 2 I

+ 2 Var m'
n
x ~ e-« ltl [2<5 ('t) - a; (s1g n 1')2]
Ch. 9. Random Functions 31S>

We infer from the form of the correlation function Rw ('t') that the-
rand9m function W (t) contains white noise.
9.79. The angle of displacement of a radar in the horizontal plan&
is a normal stochastic process X (t) with mean value mx = 0 and'
correlation function Rx ('t') = Varx e-o:\,; I cos ~'t', where Varx = 4 deg2 ;
a= 10-3 1/s; p = 0.1 1/s. At the initial moment t = 0 the displace-
ment angle is zero: X (0) = 0. Find the probability p that at the moment
t' = 0.2 s the displacement angle will be smaller than 1°.
Solution. We designate X (0) = X 0 , X (t1 ) = X 1 • Under the con-
dition that the random variable X 0 = x 0 , we can find the conditional'
distribution of the random variable X 1 from the expression f (x1 I x 0 ) =
f (x1 , x 0 )/f (x1 ), where I (x1 , x 0 ) is a normal distribution of a system of
two random variables with characteristics m 1 = m 0 = 0;
Var1 = Var0 = Varx = 4 deg 2 ,
Rx1 , xo = R.-c (0.2) = Varx e-o:IO.ZI cos~ X 0.2 = 3.68 deg 2 ,
rx 1 , xo = Rx1o x/VVarx1 , Varx 0 = 0.921.

The conditional distribution I (x1 I x 0 = 0) is normal with charac-


teristics

Hence
1
P=) f(x 1 1 x 0 =0)dx=2<:D { 0.~ 76 ) =0.81.
-1

9.80. The input of an oscillatory unit of an automatic control system.t-


whose transfer function has the form
<:D (p) = ki(Tp 2 + 'gp + k) ('g > 0),
picl~s up white noise with spectral density S~ (w) = N. Find the-
~ariance of the output signal (we speak of sufficiently distant time·
Intervals after the termination of transient processes).
Solution.
St (w) = S~ (w) I <:D (iw) 12 = Nk/(1 T (iw) 2 + ~iw + k 1 2 ).~
whence
00

Vary= JS~(w) dw=1tkN/~.


-oo

t" Note that the variance of the output signal does not depend on the
cl~e constant T of the oscillatory unit, but depends only on the amplifi-
a 10 11 factor k, the damping factor ~ and the power of the signal N.
'320 Apptled Probl~m• In Pra'babUUy Theory

9.81. The transfer funct1on of a system to which a signal X (t) lS sent


bas the form

"\\here k = 25 ..!..
8
and T1 = 0 05 s
The spectral density of the Input s1gnal
S; ( w) = 2T6rfl j + (Tro) J, 2

''here T = 1 s, and ~ :r.: === 4 deg~/s 2 r1nd the var1ance of the output
.s1gnal
Solution.

DO

Var11 = J8~ (ro} dw


0

_
- -a~b 0 +a 0 b 1 -a 0a1 b /a 3 2
t
2a 0 (a 0a 3 -a1a 2 )

1n th1s case b 0 = 0, b1 = -Tl, b 1 = 1, a 0 = TT1 t CZt = T + T1•


..a 2 = 1 + kTt a3 = k Then
Var11 = 4nTc5 bt- a1brla 3 ~ 0 0428 deg 2
:r 2 (a 0 a3 - a 1 a 1 )

9 82 The output s1gnal Y (t) 1s rel1ted to the Input stgnal X (t) by


the d 1fferent1al equatIon

a! dX (t)
dt
+aDX (t) = b dY (I)
t dt
+ b Y (t)
(I
(9 82)
I
"The stattonary random functton X (t) ts normal w1th characteriStiCS
mx and Rx ("f) = Var~ e-a. I""~" I F1nd the characterrsttcs of the output
'Slgnal Y (t)
Solution. S1nce the random functton X (t) 1S subJected to a stattonary
ltnear transformation, the random funct1on Y (t) 1s stattonary In the
.steady state cond1t1on The 'ar1able m 11 can be found from the equatton

ai
dmx
dt
+ bb dmu
+ aomx = i -dt- om,.
'B'lTtte mx -= "t.tl'tl.~\ trna m 11 = cons't 1t io~\ows \hat a 0mx = bomut
whence mv = aGmxlb 0 Equation (9 82) can he wr1tten 1n an operator
form, 1 e ,
Ch. 9. Random Fzmctions 321

whence
y (t) = ~~;t:: X (t) = cD (p) X (t),
where cD (p) is a transfe r functio n.

Var x a 1 a5 + airo2 Var x a 1 a5+ airo 2


- n a2 + ro 2 b5 + b1ro2 nbi
~~~
a 2 + ro2 bij/bi+ro2 •

We can rewrite the express ion for S~ (ro) in the form

S*y ((J)) = Varxa


b2
n t

where
2 2b2fb2
ao-at o 1
a=a1 -c ' 2
c= ' b=a,
a - b2fb2
2 o 1

We designate D1 = Varx aa/b, D 2 = Varx ac/(bfd). Then (see item 6


of Appendix 7) R (1:) = D 1e-b I 1: I D 2e-d I,; I. The varianc e of the output
signal Vary = if.Y (0) = D 1 Dz. + . .
9.83. An RC-filt er is used to lower the level of no1se X (t) (F1g. 9.83).
The filter picks up station ary white noise with spectra l density S~ (w) =

0>---- --111 --- ...._.--o


c '

R
X(t)
X(t) R Y(t}
C=*= Y(t}

Fig. 9.83 Fig. 9.84

2
10-s V s. Find the least time consta nt T = RC of the filter for which
the output signal will exceed 100mV in absolu te value with probab ility
P ~ 0.05 if mx = 0.
Solution. The transfe r functio n of the RC-filt er has the form cD (p) =
1/ (1 + Tp). Conseq uently,
S* (w)
Y
-I
-
1
i+Tiro
1S* (w) -_
2
x
1
1+T2 ro2 •
10 _5 -_ Vary
n
a 1
az+roz '
Wller? a... = liT 1/s, Vary = na X 10-5 V 2• Hence (see item 5 in Ap-
pen~tx 1) Ry (-r) =Vary e-a:I,;J.
th Smce the normal input signal is subject ed to a linear transfo rmatio n,
lToutpu t signal is also normal with charac teristic s my = 0, Vary =
(
~ ) ;o-s V:!. By the hypoth esis p = '1 - 2<}) ('100/cry) = 0.05, where
foJ~) lS~he error functio n (Appen dix 5). We nhaYe (j) (100/ay) = 0.475,
Tcry- 1.9G, cry= 100/1.96 = 5'1.1 mV, aii =Vary = 26.1 x 4-~ vz.
he smalles t time consta nt T = (Jt/Vary) 10- 5 = 0.012 s.
21-0575
322 Apphed Problemr tn ProbabflUy Theory

9.84. The Input of the RC-filter sho'\\ n In F1g 9 84 p1cks up a normal


stationary random functton X (t) '' tth characterl sttcs nl,., R~ (t} ==
Var x cos ~ -r F1 nd the char actertst 1cs of the out put stgnal Y (t)
Sol ut1on The tran qfer funct 10n of the filter has the form ttl (p) =
Tp (1 + Tp)- 1 , where T = RC Consequently., m 11 = 0 (the constant
componen t does not pass through the capacitor C) The spectral density
of the output s1gnal (see Item 3 of Appendix 7) bas the form
* 12 • (Tm)2 R)
S~' (ro} = 1 + Ti~ Sx (w) =- 1 + (Tw)a Var:.: [B (ro ~) .J... 0 (oo- p 1
Tt6l

,,,.e can f1 nd the ' ar1 anee Vary of the out put Signal from the ex pression
(~ee Append1"t. 6)

-oo
OQ

~ Varx f (Tro)2 2 {T~V'


~ 2 J 1 + (Too)t ( 6 (w + P} + 6 (w- ~)] dm =- Var~ 1 +<T~)'~
-oo

The univariate distributi on of the output s1gnalis normal With param


eters m 11 = 0 and Var 11
9.85 The tnput of an automat1c 'oltage regulator rece1ves a voltage
X (t) 'vhtch IS a normal stationary stochastic process 'v1th character1s
ttcs m :r. = 220 v R X ( t') = v ar X e-a ' '( t ( 1 +
a I 't t 1\ here Var:: = n
ai = 16 V 2 t a= 200 1/s Tho volt "lge regulator operates normally 1f the
voltage X (t) 1s no lo\\Cr than 208 V and no higher than 232 V,. otherwise
tt automatic ally S\\ 1 tches off and S'\\ tlches on egatn "ben the Input
voltage X (t) returns to the tnd1cated range F1nd the probabtltt y P
that the regulator 'vtll be operat1ng at an arbitrary moment t, the
dtstr1but1 on of the durat1on of the normal operatton T1 and the average
t
durat.lon of the perJod 2 -nhen the operator J.s .swJtcbed ofi
Solution In accordanc e "\\ 1th the solution of Problem 9 70 (b),

R 11 (t} = - a' .. [Varx e ... c1,.1 (1 + et [,;] )J


012

= V a r ~ a.2e- at tf ( 1- a. 1't J ) a11


J = a a•.
The average number of t1mes the level a Is crossed ts

A~-- 2" { exp [ - {a;~x)


1 11
= 0 354 1/s J} ::
Conseque ntly, the total average number of t1mes 1t crosses
the perm I ~sihle level 1n unt t t1me 15 2Aa = A1 === 0 708 1/s For all
practical applicatio ns '' e can consider the distribut1 on of the random
vartahle T2 to he exponent1 al w1th eypectat1o n f = 1/'A1 = f 41 s
The probab1ltt y that the regulator wtll be operating at 1.n arb1trary
moment 1s
p = P (208< X[(t)< 232) = p (I X (t)- m:r 1< 12) = 2<D (3) = 0 9973
Ch. 9. Random Functions 323

On the other hand, we can find the probability p from the expression
p =tAt1 + t 2 ), whence t 2 = ~ ('1 - p)lp = '1.3'1 X '10- 4 s.
9.86. A radio device can function only in a certaiu temperature
range t 0 ± 30 °C, where t 0 is the average rated temperature. When the
temperature passes either limit, the device fails. The ambient tem-
perature X (t) is a stationary random function with mean value mx = t 0
and mean square deviation ax = '10 °C, and with a correlation function
of the form Rx ('t) = 100 (sin 't)/'t °C 2 ; the distribution is normal. Find
the average number of times the temperature crosses the levels t 0 ± 30 °C
during a time period T = 50 s. Assuming the number of level crossings
to have a Poisson distribution, find the probability that the device
will fail during the time T.
Solution. According to the solution of Problem 9. 72
82kx ('t) { 1 't }
Ry ('t) = - a.2 = -1oo - s+ 215-.. .

Setting 't = 0, we find Vary= Ry (0) = 33.3, ay = 5.76 °C. According


!o f01·mula (9.0.42), the average number of times the level a = rnx +
30oc is crossed upwards is
!..a= 1 e-a 2f200 cry = 0.00134 1/s
21t ax
The average number of downward crossings of mx = -30 oc is the same,
hence the total average number of times the permissible limits are
crossed is 2f...a = 0.00269.
For the random variable Y, which is the number of times the level
is crossed and which has a Poisson distribution with mean value 2/..aT =
0.00269 X 50 = 0.1345, the probability that it exceeds zero (at least
one level crossing occurs) is 1 - e- 0 •1345 = 0.125.
CIIAPTCR fO

Flows of Events.
I\farl\ov Stoc1Iasttc Processes

tO 0 A fiow of tr.~nts 1s a sequence of homogeneous events occurnng one nrter


another at r~ndom ntoments l:.x1.rupl~s .are a flo'v of request! at a telephone et
change the goals scored durtntt an 1c~ hocke) gamr the occurrence of faults dunn~
the operatton of a computer the JOb reque.sts at a compu\lng centre nnd so on
A no\\ of events can be depJct.Pd by a serJes or points With nbsczssas el et
en (Fig 10 0 t} '\ lth Inter\ als TJ = el - el
T I= e,- 81
Tn === en +t - en In proba.bzltty thpOr) a flO" of events can he represented a5
a ~equence of ra. ndorn variabl PS el s 2 = el T1 +
= el e3' T1 + +
Tt
Note that the terlll e-.... ent 1n the concept flow of evPn t.s d 1fJers In ccense from
the term 'random event "'hJcb we 1ntroduced ear her It 1! sensele-ss to !peak cf

,.,Tt T2 T3 Tn
r '"'-~I -
1-.
~ ,....A--.
t
0 e, 9z eJ en+f

r1g 10 0 t
the probabthty of events 1n a llo'v (sa} the prohabthty of a request" at 1. tel.: phone
exc1:iange SlnCe sooner or l1ter a request 'v1ll arr1ve and not JUSt one A flow cf
events can he assoctated "1th ''lrtous random eve-nts for Instance
A = {at least one request 'v1ll arr1\e dur1ng the HJ.terval
from 10 to t 0 't} +
or
n = {t\\ o requests "tll a1T1\e dunng the same tnterval}
The pro ba b1 I1tt es of t hesc events can be ca 1cula to d
It must be po1nted out that "'e can depict as a ser1es of po1nts not a flow of
events 1t.se l f (Lt 1.9 a cc1den ta1) but 1t 6 cer ta 1n s pectfi.c rea h sat 1on
\Ve mentioned 1n Chapter 4 flows of events and sollle or thetr properbes but
we shall no~ consider them here 1n more deta1l A flo~ of events IS stationary 1f
the proba h1h ty charac ter1stlc"= do not depend on the c.ho 1ce of a reference ~o1nt or
more speczfi.call} 1f the prob1.b1hty that a partu~ular number of events 'v1U /all on
any tune Interval depends only on the length T ()f the Interval and does not depend
on Its place on the t ax1s
A flow of ev-ent.s 1s sa1d to be ordtnary If the probab1hty that two or more events
may fall 1n an elementary tlme tnterva f!.t 1s neghgtbly small ag compared to the
probahdtty thlt one event may fall on that 1nterval The ord1nar1ness of a flow
mean~ that events 1n 1t occur one at a t1me and not 1n ~oups of two three etc
(that t '' o evell.ts may occur sunul taneously is theorettcall y possthle hut has zero
probab1I Lty)
Au ordtnary t1o'v of events ~an he Interpreted as a stochastzc process X (t) I e
the number af events tnat occurred up to the moment t (Ftg ~0 0 21 The stochastlt
process X (l) lDCreases JUmpwiSe by unlty at the polnts el 92 en
A flow of events lS known as a flow wlthout on ajtrte:IJ,ct tf the number of event 3
f.alhng on any t1me Interval T does not depend on how many events fell on any
other nonoverlapping Interval In essence the absence ol an aftereffect Itl a now
means that the events whtc!l form the Oow occur at certa1n moments Independently
of ont another
Ch. 10. Flows of Events. Markov Processes 325

A flow of events is said to be elementary if it is stationary, ordinary and has no


aftereffects.
The interval of time T between two successive events in an elementary flow has
an exponential distribution
f (t) = /..e-J..t (for t > 0), (10.0.1)
where/,= 1/M [T] is an inverse of the mean value of the interval T.
An ordinary flow of events without an aftereffect is a Poisson flow. An element-
ary flow is a special case of a Poisson flow (namely, a stationary Poisson flow).

X(t)
J

I
'

I l
~ ; J
~
~
L
I
i t
0 ef e2 e., en
Fig. 10.0.2

The intensity'}.. of a flow of events is the mean number (expectation of the number)
of events per unit time. For a stationary flow '), = comt; while for a nonstationary
flow the intensity is generally a function of time; '), = '), (t).
The instantaneous intensity of a flow '), (t) is defined as the limit of the ratio
of the mean number of events which occurred during an elementary time interval
(t, t + l'lt) to the length l:!..t of the interval, as the length tends to zero. The mean

T
· . o ~
--L.'---1@~~~~ ~.---<O>----<~o---oo----{@)1---eo---- t
0 I
'---yr---'I
Ti Tz· T3 T11
Fig. 10.0.3

number of events occurring during the time interval -r immediately following the
to+"
moment t 0 (see Fig. 10.0.1) is a (t 0 , -r) = ) 1.. (t) dt. If the flow of events is sta-

tionary, t~en a (to, -r) = a (-r) = J...-r. to . •


't dAn ordmary flow of events is a Palm flow (or recurrent flow, or a flow wtth a hm-
1 e a.ftereffect) if the time intervals T1 , T 2 , ••• between successive events
(Er Ftg: tp.O.i) are independent similarly distributed random variables. Because
~ t 1te stmilarity of distributions T1 , T 2 , • • • a Palm flow is always stationary.
:~nhelementary
1
flow is a special case of a Palm flow; the intervals of the events in
ave an exponential distribution (10.0.1), where /.. is the intensity of the flow •
. }{_lang's flow of order k is a flow of events which results when an elementary flow
~h ~ mned out", i.e. when every kth point (event) in the flow is retained and all
f c mtermediate points are removed (see Fig. 10.0.3, "here it is shown how Erlang's
ourt 1t-order flow is obtained from an elementary flow).
su In fErl~ng"s flow of order k the time interval between two adjacent events is the
I
ti 0d. k !ndependent random variables T1 , T 2 , • • • , Tk which have an exponen-
a tstrtbution with parameter /.:
k
T= 2j Tt. (10.0 .2)
i=!
326 Applled Problems In Probability Thf!ory

The dtstrtbution of the random vnrtable T 19 known as an Erlang dtstrtbuttcn


of order k (see Problem 8 38) and has a dens1ty
"' (At)Tt-t -1t
1~ (t) = (k- , e ((oT 1 > 0). (t\l 0 S)
11
The mean value~ var1anee and mean square de' 1at1on of the random variable T
(10 0 2) aro
(10 {} 4}
respectn·ely The coefficient of \ariation of the random var1ahla (tO 0 2) IS

Vt = a,lmt == 1/J/k,. (10 0 5)


hence vt -+ 0 ag k -+- oo i e v. hen the order of Er Ian g 's f1ow )ncreasest "the degree
oi randomness' of the 1ntenral bet"\\een the events tends to zero

--~------~-----~----~t
t0 (the presrnt) • till ..

r,g too 5 Flg11 10 0 6

II. along w1tb thtnn1ng out' of an Plementa.ry flowt we also change the scala
along the t ax1s (by d1v1d1ng by k), "e obtatn a standard,zed Erlang
_..
flow of ord~r It
whose 1ntenstty does not depend on k The ttme interval T bet"'een succes5IVB
events 1n a standardtzed Erlang flow of order k has a dens1ty
ftJ. (£) = kA. (kAt)h-t e -I( 7..t (for t > 0) (10 0.6)
~k -i~l I

The numerical characteru~tJcs of the rdndom varlahle


h
,._ 1 ~
T =k L.J Tt
f~t
are

As k Increases the standardized Erlang flow approaehps IndefinitelY a regular


Pow "1th a constant Interval l = i/'A bet\veen tbe events
A stochastic process In a physical system S ts kno\vn as a Afarkov process (or
a proceQs \VI thou t an aftereffect) 1f for any mame11t t() (F1g i 0 0 4) the probab tltty
of o. n]J state of the S.JJSf~m in juture Jfor t .> tJ\) de.,vends onl.JI on ~ts state tn the pres~nt
{for t = tQ.} and does not depend an when and hol' the systtm S reached tkat state (lO
other words for a fixed present the future does not depend on the pre history of the
process, 1 e on the past)
In th1s chapter "e ~hall study only 1\larkov processes "1th discrete states 1 1d
12 • &n It 1~ con' en1ent to Illustrate thts lnnd of process by means of a dtrecte
graph of states (Ftg 10 0 5), "here rectangles (or circles) denotn tho states sl .. 't • •
Ch. 10. Flows of Events. Markov Processes 327

of system S, and the arrows denote possible transitions from one state to another
(only the direct transitions are marked and not passages through other states).
Sometimes the possible delays in a state are also marked on a directed graph using
an arrow ("loop") directed from the state into itself (Fig. 10.0.6), hut this represen-
tation can be omitted. A system may have a finite or infinite (but countable) num-
ber of states.
A Markov process with discrete states and discrete time is usually called a Mar-
kau chain. For such a process it is convenient to consider the moments t 1 , ! 2 , • • •
when the system S can change its state as successive steps in the process, and to

X(t}
[ f
I I
I '? I I
I I I I
I e I I
I I I
I J, I
I I
r I
2 1 I r-
I I I I
1
I ~

.
I

0 t3 t
Fig. 10.0.7

consider the ordinal number of a step 1, 2, ... , k, .•• rather than time t, to
be the argument on which the process depends. In that case the stochastic process
is characterized by a sequence of states
s (0), s (1), s (2), ... , s (k), .•• , (10.0.8)
wfhere S (0) is the initial state of the system (before the first step), S (1) is the state
? the system immediately after the first step, ... , S (k) is the state of the system
Immediately· after the kth step . . . .
The event {S (k) = sl} = {the system is in state si immediately after the kth
step} (i = 1, 2, ... ) is a random event and, therefore, the sequence of states
(10.0.8) can be regarde;l as a sequence of random events. The initial state S (0)
may he either assigned or accidental. The events of sequence (10.0.8) are said to
1orm a Jl[ arkov chain.
Let us consider a process with n possible states s 1 , s 2 , • • • , sn. If we designate
as X (t) the number of the state in which the system S is at a moment t, then the
~o(cess (1\Iarkov chain) can be described by an integral-valued random function
. t t) > 0, whose possible values are 1, 2, ... , n. This function jumps from one
lnf egral value to another at given moments t11 t 2 , • • • and is continuous on the
e t, as depicted by the points in Fig. 10.0. 7.
.Let us consider a univariate distribution of the random function X (t). We
1cs2gnate as Pi (k) the probability that the system S will be in the state si (i =
.' • · · ., n) after the kth step [and before the (k +1)th step]. The probabili-
ttihes P1 (k) are called the probabilities of the states of a Markov chain. It is evident
at for any k
n
Lj Pi {k)=i. (10.0.9)
i=l
Tho probability distribution of the states at the beginning of the process
P1 (0}, P2 (0), · · ·• Pi (0), · • ., Pn (0) (10.0.10)
328 Applled Probl~mg tn Proba.bllltv Theortt

is known as the inlhal probabU~rty d!stributfan of the 1\Iarkov chaJn In parttcular


1! the exact 1n1 Ual state S (OJ of :S} stem S is kilO\\ n say. S (0) = sf' then the rnltua.l
probahlhty PJ (0} = 1 au.d all the other ~n~ttal pro.bahthttes e.te tero
s,
Tho p rob a btllty ()I tra nsthon at the kth ste_E Ironl the state to a state ·') 1s the
eondztz.onal probability th1t the s}stem S wJU be Jn the state 'J alt~r the kth step
prov1ded that tmmed lately he-fore th1s (a Her k - f steps} It ''a~ tn the state I{
A ~larkov chain is s~ud to he hom.ogen.eou1 1f the tran!1tlon prohabtltt1es do
not depend on the ordJnal number of the step and only depend on the steps from
whtch nnd to \\htch the S}stem passes
P { S (k) = aJ J S (k - 1} = s 1} = P , 1 (10 0 f I)
The transJtlon yrobabthhes Pu of a homogeneous l\larkov cha1n form o.n n X fl
square matrrx ca led a trnns1tron ma.trtx
Pu P11 P11 P1n
P~u p2! PIJ P,n
• (10 0 12)
IJPuH= p p 1.} ""
h
,.. .
Pn}

The !Um of the transthon probabJhtJes 1n any ro\v of the 1natr1x 1s equal to un1t1
n
~ P, 1 =t (t=l, ••• n). (10 0 13)
J-=1

A matt! X \vhtch pos3esses th1Q property IS caUed a stochashc matru: The prohah1hty
P ,J 1s the probab1h ty that a syr;;tem \\ btch 1s 1n the sta. te s1 heiore a g1 ven step 1nll
re1na1n 1n tb.at state at a next step
If the 1n1 t1al prohabJh ty dlstrthutton (1 0 0 10) and the ma tr1 t: of trans1hon
proba.bthtv~s (tO 0 12.} are gtven fer a homog~ncou2 !-.!arkov chaJ.n~ then the prob-
abthttes of the states of the system Pi (k) (t ~ 1 2, t n) can be found from
t

the rccurs100 formula


n
p,(k)=LJ PJ(k-t)PJI (i-1. (10 0 t4)
.l~t

For an Inhomogeneous ~farkov eha1n thr:- tran~tuon probab1ht1es tn matrl%


(10 0 12) and formula {tO 0 14) depend on the ordJnal number of the step k
In actual calculahons usJng formula (10 0 14} not all the states ss need be taken
Into account hut only those lor "Inch the tran.sttlon prohah1htJes are nonzero, 1 e
those from which arro" s lead to the state sl on the directed graph of states
A J..farkov process wtth d1scrctc states and contJnuous time is sometimes called
a continuous h1Jrkov chatn For such a process the prollab1hty of trans1t1on from
a state st to a state s1 1s 2ero for any moment In that case~ 1nsted.d of the transitiOn
probabihty PtJ ''e constder the transition probability density 'A11 '\\bich 1s defined
as the hm1t of th-e ra.t1o of the ptobah1ht) oi ttans1t1on trom. the state sl til the
state sJ dur1ng a small t1me lnterv.al 6.t after a moment t to the length of that Inter-
val as 1t tends to zero The transtt1on probah1hty de11s1ty can be e1ther constant
(Au = const} or dependent on ttme (~u ~ ~U (t)\ In the first case a ~farkov sto-
chastic process 'v1th d1screte states and conhnuou.s hma lS sa1d to be homogeneous
A stocbast1c process X (t), wh1ch IS the number- of events Jn an elementary Dow
occurring before the moment t, ts a typtcal example of such a process (see F1g iO 0 2)
\Vhen cons1der1ng stochastic processes w1th dtscrete states and conunuou:s
t1me .. 1t 1s conventent tu a:s.sume that the transitions of a system S from state to
state are affected by some t1ows ol events In that ca~e the dens1hes of trarl~ltlonf
prohab1h.t1es a~,ume the mean.1ng of inteDs1ties AIJ of the conespw.d1ng now o
events (as soon as the first event occurs in the flow w1th 1nte11~n ty "-u· th.e eystem
- Ch. 10. Flows of Events. Markov Processes 329'>

••. jumps from state s 1 to state si). If they are all Poisson flows (i.e. ordinary and without
·-
;• aftereffects, with the intensity being constant or dependent on time), the process-
in the system S becomes Markovian.
~Vhen considering .Markov stochastic processes with discrete states and contin-
uous time, it is convenient to use a directed graph of states on which the intensity-
"• 1.1i of the flow of events which shifts the system along a given arrow is marked he-
t: fore each arrow (Fig. 10.0.8). This type of
directed graph is known as a marked graph.*)
The probability that the system S which
is in a state s1, will pass to a state si during
an elementary time interval (t, t + dt)
{the element of the probability of transition
from s1 to s1) is the probability that during
the time dt at •least one event in the flow
which shifts S from s 1 to si will occur. With
an accuracy to infinitesimals of higher orders
this probability is '"A1i dt.
The flow of the probabilittes of transition
!rom s1 to si is a quantity '"AiiPi (t) (here the Fig 10 0 8
mtensity X1i may either he dependent on · · ·
or independent of time).
Let us consider the case when the system S has a finite number of states s1,.
s2, • •• , sn· To describe a process in the system, we use the probabilities of states:
P1 (t), P2 (t), · · ., Pn (t), (10.0.15).

where Pi (I) is the probability that at a moment t the system S is in lthe state s1:
Pi (t) = P {S '(t) = s 1}. (10.0.16),
Evidently, for any t
n
~ Pi (t)= L ( 10.0.17)-
i=1

. (To find the probabilities (10.0.15), we must solve a system of differential equa-
hons (Kolmogorov's equations) which have the form
n n
dp;(t) " ~ = 1,t2, ... , n),
dt = LJ AiiPi(t)-pi(t\ LJ f..ii (i
i=i j=1

or, omitting the argument t of the variables Pi>


71 71

_dfti =~ l•iiP.il- Pi ~ l•zi (i = 1, 2.... , n) • (10.0.18)·


i=1 i=1

R.emem~er that the intensities of flows l'ii may depend on time t (to make the·
no t ah~n bnefer, we have omitted the argument t).
It Is convenient to derive equations (10.0.18) using a marked graph of states-
of t?e system and the following mnemonical rule: the derivative of the probability
~~ elery _state is equal to the sum of all the probability flows which pass from other states·
6
~ 1
ze gwen state minus the sum of all the probability flows u·hich pass from the given
ate to other states. For example, for the system S whose marked graph of states-

~t ;> .wll shall not draw loops corresponding to delays of the system in a
• a e m the directed graph of states since a delay is alwars _possible.

gtven.
.:330 A p pl1ed Pr ob le m s in Pro bab Utty Th eo
ry

is g1ven 1n Ft g 10 0 8 th e Kol mogorov s eq


ua tlOil.S ar e
dp1/dt = 13tPa + Ao~tP4 - ("-ts + Attl Pt t
dp 3l dt ~ ~I.P1- "--t3P3
+ Aat + A,s5) Pa
dp3/dt == A 2: sP 1- (Ast
(10 0 19
dp,ldt = Ax4P1 + A3-tPa + 1 st F 6 - Ao.Po~
dp ,)d l = A3~P1 - AatP1
Since conditlOn (tO 0 17) 1s sat1sfied fo r an
pr ob ab th t1 es (10 0 15) in te rm s of the y t w e ca n ex pr es s an y one of tb~
11umber of eq ua tio ns by one ot he r pr ob ab ih t1 es anCl th us dlm1n1sh
th•
To solve th e system of dt ff er en tta l eq
-states p 1 (t) p 2 (t) ua tio ns (tO 0 18) for th e probabthtLes o
Pn (t) \ve m us t sp ec if y the 1n ttl al pr ob
ab 1h ty dl!trabutlot
Pt (0) P2 (0) , Pt {0) • Pn (0), (tO 0 2Q
-whose sum is eq ua l to un1ty
ll

L Pl (0} == 1
J~t
If 1n a spec1al case th e st at e of th e sy
-e xa ct ly kn o' vn S (0) = 'l th en p (0 st em S at Lhe 1n1 t1al m om en t t = 0 H
{1 0 0 20) ar e ze ro 1 ) == 1 and th e ot he r tn 1t ta l probab1ltt1e~
ln m an y cases when th e process 10 a sy
be ha vr ou r of th e pr ob ab th tL es p (t) as t-+ st em 1s su ff ic ie nt ly long th e hmlting
1 oo be co m es of 1n te re st If all the flows

s,

F1g too 10
"Of ev en ts whLch sh tf t a sy st em fr om st at e to
Po1s5on processes \Vl th ca ru ta n t 1n tenst tte st at e ar e el em en ta ry (1 e stattonar}'
:etates ar e po ss ib le .. s ).,iJ} th en ltm tt ,,.g
pr ob ab Lh t1es of
p t = ltm p {t) (f = 1 • n) (10 0 21)
t-o o
whteh do no t de pe nd on the st at e lD w ht ch th
Th ts m ea ns th at 1n th.e co ur se of ttm e a e sy st em \vas at th e 1n1tial moment
U m itt ng st at to na ry cond~hon. IS es ta bl is
th e sy st em Th e sy st em st1ll passes he d 1n
from st at e to st at e bu t th e pr ob ab tll t1 es
st at es no lo ng er ch an ge In th1s hnut1nC J' of the
-can be In te rp re te d as th.e relatltJe m ea n°t co nd iti O n ea ch of th e hm1t1ng pr ob ab 1l lt 1es
-state tme fo r ,vh1ch the sy st em st ay s 1n a g1ve
n
A sy st em for 'vh1eb the lu ru ttn g pr ob ab
te rn an d th e co rr es po nd in g st oe ha st te pr 1h tle s ex zs t Is kn ow n as an ergodic sys
Th e cond1t1on )..,11 -== co ns t al on e ts 1nsuoc es s as an ergodl~ proeess
abi!JtLes an d ot he r co nd ttt or u m us t be fu fftc1ent fo r th e ex ts te nc e o! h.m1ttng pro~
.graph of st at es by Js ol at tn g es se nt ta l an dlfilled ,vhich ca n be ve rtf ie d from t e
~ts ts se nt ta ltf th er e ts no st at e s su 1ne3sant1ll st at es Ul 1t The st at e 1i
1 ch th at ha va ng on ce pa ss ed fr om s, to 8J tn soma
Ch. 10. Flows of Events. Markov Processes 331

way, the system cannot return to si. Every state which does not possess this pro-
perty is inessential.
For example, for a system S, whose directed graph of states is shown in
Fig. 10.0.9, the states s1 , s 2 are inessential (the system can leave state s1 , say, for s2
and not return, and leave s 2 for s 4 or s 5 and not return), whereas states s4 , s 5 , s 6
(II~
and s7 are essential.
For a finite number of states n, for limiting probabilities to exist, it is necessary
and sufficient that it should be possible to pass from each essential state (in a certain
number of steps) to every other essential state. The
ph graph shown in Fig. 10.0.9 does not satisfy this ,1.
'•h 21
u,l; (:ondition (for instance it is impossible to pass from
the essential state s 4 to the essential state s6 ).
Limiting probabilities do exist for the graph
shown in Fig. 10.0.10 (it is possible to pass from
every essential state to every other essential state). itrz
Inessential states are thus called because sooner
{)r later the system will leave these states for one .1. 13
{)f the essential states and will not return. The
limiting probabilities for inessential states are
naturally zero.
If a system S has a finite number of states
I
.s,, s2, ' ••• , sn, then, for the limiting probabilities
to exist, it is sufficient that the system could pass
I~
from any state to some other state in a certain number
?f ~teps. If the number of states s1 , s 2 , ••• , sn, ,.••
Is ~nfinite, then this condition is no longer suf-
ficlent, and the existence of limiting probabilities Fig. 10.0.11
<l epends both on the directed graph of states and
{)n the intensities 'J... ••
. The limiting prohibilities of states (provided they exist) can he obtained by solv-
mg a system of algebraic linear equations which result from the Chapman-Kolmo-
~o~ov differential equations if their left-hand sides (derivatives) are set zero. But
It IS more convenient to derive these equations directly from the directed graph
<lf states using the mnemonical rule: for every state the total outgoing flow of proba-

Fig. 10.0.12

.fiilit~es is equal to the total incoming flow. For example, for a system S, whose directed
marb·ebd.f:fr~ph of states is shown in Fig. 10.0.11, the equations for the limiting
pro a ilthes of states have the form
(Au+ 'Au) P1 = 'Aupz, (Azt + /..21) Pz = 'J..12P1 + l..~zP-t· {10.0.22)
{A 31 +J.3s) lp3 = 'Ar3Pt• 'A.:zP4 = 'A.z~Pz + I.MP3 + 'As4P5• l.s4Ps = !..35P3
:rh~s, for a system S with n states, we obtain a system of n homogeneous alge-
talc hnear equations in n unknowns p 1 , p 2 , ••• , Pn· This system will yield the un-
th owns P1• p 2 , • • • , Pn'!with an accuracy to within an arbitrary factor. To find
dJl· exact values of p 1 , • • • , Pn• we must add to the equations the normalizing con-
.i lon Pr + Pz + ... + Pn = 1. We can then express each of the probabilities Pi
n trrms of the other probabilities {and correspondingly delete one of the equations).
st t n practical applications we often encounter systems whose directed graphs of
a a es. have the form shown in Fig. 10.0.12 (all the states can he extended to form
~d~hnm, each state being in a relation, both forward and backward, with the two
Jacent states except for the two e-..:treme states, which have only one neighbour).
332 Appl td Problems 111 Probablllty Theory

The chain sho,vn 1n FJg tO 0 ;2 IS kno\\n as the birth and death prtJCe$t The11e
terms have heen borrowed from hlolog1cal problems where the .state of a popula
t1on ..rlllHgnlfies 0the pre.sence of k un1ts 1n it A transJtJon to the r1ght IS connected
'vtth the b1rtb of un1ts and that to tho left Vtlth tbe1r ...death~ In FJg iO 0 12
the birth rates f'. 0 i 1 An 1) are put h~side the arrow-s lead1ng from left
to r1g ht and the de a th ra te.s"' (~ 1 p. 2 ~ n 1 ) are put hes1 de the arro'\\'! Ieadtng
from r1g h t to leit ca ch oI them has an index of the state from which the corre~pond
fng arrO\V leads
For a h1rth and death process the hnutlng probahllltte s of states are e"tpre ~ed
by the formula!l

'
Ani.. 1 An 1
Pn ::;::; 1-' 1Jt:a Jln Po

Po- ( 1 -L ~: + ~:~: + + ).~~~~ >-;n 1


r I• (10 0 23)

Pay attentton to the rule o! ca.lculattng the probab1hty of (a state (for k =


2 n)
tr,;, 1 i. :t. -1
Pk := Jl-t ~~ J.lk Po
wh1ch can be formula ted as follo'\\o 13 tht proba bllfty of a state ln a birth and death
proctss (sec F1g tO 0 12) is the product of all tht birth ralts to ~he left of 'k d~tnded bg
the product oj all the dto. th rates to the l~fJ of sit tht frnet ton btlng multiplltd by 1M
probabthty of lhe ext eme l~ft statt Pe
If a process l9 de~ Ctl bed h y the h1rth and d ea. th ch a 1n we can "'1t e d1ffere ntlal
equations for the mean value and var1ance- of the funchon X (t) 1 e for the number
of un1 ts 1n the system at a moment l

(10 0 2-t)

n
(d Var {t
dt =~ (10 0 25)
h-0
In these formulas "e lllU5t set An. = Jlo = 0 The Intensthes Ah. (0 ~ k ~ n - i)
and Jl.t (i ~ k ~ n) can be any nonnega t1 ve functions of t1me
\Vhen the values of mx (t) are sufficiently large (>20) and the condition 0 <
mx- (1) ± 3 l Var~ {t) < n 1s satisfied n-e can approxunat ely assume that th&
section of the random functlon X (t) 1s a normal random varlable "\\ 1th parameters
mx (t) Yvar.-t(t} wh1ch "ere ohta1ned by solv1ng equations (10 0 24) and (tO 0 25)
Formulas (J 0 0 24) and {10 0 25) rema 1n vahd as n-+ oo 1f the upper hm1t 1n the
sums 1s replaced by oo

Proble1ns and Excrclses


10 1 T,,o elementar y flows are super1mpo$ed flow 1 w1tb 1n
tens1ty A1 and flo~ II w1th tntens1ty A~ (F1g 10 1) Is flow III
result1ng from the supe-rpos1t1on elementar y and )f 1t 1s. '\Vhat lS tts
1ntens1t)?
Ch. 10. Flows of Events. Markov Processes 333
....

Solution. Yes, it is since the properties of stationarity, ordinariness
!· and absence of an aftereffect are conserved; the intensity of flow III
j is equal to A-1 + '-2·
:1 10.2. An elementary flow of events with intensity A. is thinned out
..: at random. Each event, independently of the other events, may be
I

.-

I 0 I I
>----.----- - - - - - - - t
I I
I I l 1 I
1 l
I I I
--i-'~-t
I
I
I

Fig. 10.1

conserved in the flow with probability p or removed with probability


1 - p (in what follows we shall call this operation the p-transfor-
mation). What is the flow resulting from the p-transformation of an
elementary flow?
Solution. The flow is elementary with intensity A.p. Indeed, under a
p-transformation all the properties of an elementary flow (stationarity,
ordinariness, absence of an after-
:ffect) are conserved and the intensity f(t)
IS multiplied by p. il. - - - -
10.3. The time interval T be-
tween events in an ordinary flow
has a density
[
l.e-i.(t-to) for t > t 0 ,
I (t) = l
0 for t < t 0 •
0
t
(10.3)
The intervals between the events Fig. 10.3
are independent. (1) Construct a
graph of the probability density j (t). (2) Is the flow elementary?
(3) Is i~ a Palm flow? (4) What is its intensity 1:? (5) What is the variation
coefficient v1 of the interval bel ween the events?
So~utio~. (1) See Fig. 10.3. We call a distribution of this kind "expo-
nential, displaced by t 0". (2) No, it is not, since distribution (10.3) is
~ot ~xponential. (3) Yes, it is, because of the ordinariness of the flow,
Ie ,__Independence of the intervals and their similar distribution.
( ) i. = 1/M [T], M [T] = 1/'A + t 0 , ~ = (1/'A
4 + t 0 )- 1 = 1./(1+/,t 0 ).
(5) Var (T] = 1 _ 1 crt 1/'). _ 1
'j.2 ' at- 1. ' Vt = M [T) 1/l.+to - 1+1.t 0 •
334 Applied Problem1 fn Probab lfty Theory

tO 4 There 1s 1n eletnen tary flO\\ of events 'v1th tntenstty 1 on the


t ax1s Another flow IS formed from 1t by tn~ert1ng another event at the
m1dpo1nts of the 1nterval bet\\ cen e\ ery l\\ o adJacent e' ents (~ee
Ftg 10 4 'vhere the Circles denote the pr1nctpal events and the crosc=es
denote the IO~erted Ones) rtnd the diStribUtt on denstty f (t) of the

0
rlg 10 li

Inter\ al T bet" eon adJacent c' cnts Ill the ne'v flo'v Is the flo" elemen
tary? Is 1t a Palm flow? ''hat IS the cooffictent of Yar1a lion of the 1nter
val T bet" con the e\ onts~
Solution T - Ji./2 ''here X has an exponenti al distrlbutto n '\\llh
parameter A / 1 (x) = A.e "'x (x > 0) Us1ng the solutton of Problem 8 1
and setttng a = 1/2 b = 0 In formula (8 1) \Ve obta1n
f (t) = 2Ae- 2'- i (t > 0) {tO 4)
Th1s ts an exponenti al dtstrtbut1 on "1th a coefficient of var1atton
v, ~ 1 Neverthele(O:.s the new flo'\\ 1s not elementar y nor even a Palm
flow \\ e shall first pro' e that 1t IS not a Palm flov. Though the In
tervals bet" ecn the e\ ents ha' e the '=a me distributio n (10 4) thel are
not Independe nt Let us constder t'\\O adJacent 1nter\als They may
be Independ ent'' 1th probab!ltt y 1/2 or equal to one another '' 1th prob
abiltty 1/2 and consequen tly dependent Thus the now flow oi events
ls not a Palm flo" It 1s naturally not elementar y stnce an elementary
flow ts a spectal case of a Palm flow
Thus an exponenti al distributio n of
an tn terval bet'\\ een events lS not a
suffictent condttton for a f]o'' to be
elementar y
10 5 The hypothesi s 1s the same as In
Problem 10 4 except that the new flO\\
cons1sts only of cro~ses (the m1dpo1nts of the Inter' als) ~ Ansv,er
the same questions as 1n the preceding problem
Solutton The Intensity of the ne\v flo'v ,viii not ev1dently change as
compared to the Intensity of the original flow and will rema1n equal to
A The Interval bet~een t'\\ o neighbour ing crosses (F1g 10 5) Is
T (Xi + X l+ 1)/2 1 (10 5)
where xi and xf+l are tVvo adJaCent Intervals lD the orrgtnal flow The
v ar1abl es X 1 and X 1+1 h oth b a' e an ex ponentral dJstribu t1on w1 th
parameter A and half theJr sum (10 5) \VIll ba\e a standardized Erlang
dtstrihntio n of the 2nd order since the tnte:r' al T lS equal to the sum
of two Independe nt exponenti ally distribute d random variables d1v1ded
by2 two Thus the Interval T between two crosses has a density I (t) =
4"'- te - 2"-t ( t > 0)
Ch. 10. Flows of Events. Jlfarkov Processes 335-

ln the transformed flow all the neighbouring intervals are dependent


since their components are the same random variables. However, this-
dependence involves only neighbouring intervals. Flows of this kind
are sometimes called flows with a \veak aftereffect.
The coefficient of variation v1 for the random variable T is {see-
formula (10.0. 7)] Vt = 1tV2 < 1.
10.6. The traffic on a road in the same direction is an elementary flow
with intensity A.. A certain Petrov stands on the road and tries to stop·
a car. Find the distribution of time T for which be will have to wait~
its mean value m 1 and mean square deviation CJt.
Solution. Since an elementary flow bas no aftereffects, the "future''"
does not depend on the "past", in particular on the time when the last
car passed. The distribution of time T is the same as the distribution.
of the time intervals between successive cars, i.e. exponential with
parameter A.: f (t) = A.e- i.t (t > 0), hence m 1 = 1/A., Var 1 = 1//...2 ,.
<J1 = 1/'A = mt> vt = 1.

Remark. If the traffic on a highway is in more than one lane, it can be consid-
ered to be a superposition of several flows, each corresponding to one lane. If each,
Opw is elementary, then the result of the superposition is also an elementary flow
smce the properties of stationarity, ordinariness and absence of an aftereffect are-
conserved under a superposition (see Problem 10.1).
10.7. A passenger arrives at a bus stop irrespective of the time-table.
The buses arrive regularly with an interval l. Find the distribution of
~ime T for which the passenger will have to wait for a bus and express.
Its characteristics mt and CJ 1 in terms of the intensity of the traffic A..
Solution. The moment the passenger arrives at the bus stop is dis-
tributed with a constant density within the interval l between two-
buses. The distribution density of the waiting time T is also constant
Ia uniform distribution on the interval (0, l)]: f (t) = 1/l (0 < t < l)t-
or; de~ignating 1/l = A., j (t) = A, (0 < t < 1/A.). For a uniform dis-
tributiOn on an interval of length 1/A. we have m 1 = 1/(2/...), Var1i=
1/(12/-2), CJ 1 = 1/(2 )/3!.), Vt = 1/y3.
10.8*. There is a Palm flow of events on the t-axis, the intervals-
bet,veen which are distributed with density f (t). A point t* is thrown.

T*
"
l
0
• • )(
tlf
~ .... t

Fig. 10.8

at random on the t-axis (say, an "inspector" arrives to verify the occur-


re~ce ~f events, or a "passenger" arrives at a bus stop), the moment t*
~:~ng m no w~y connec~ed .wit~ the occ:urrence of ~he events _in the flow·
lh Ig. 1~.8). Fmd the dtstnbutiOn density of the mterval r~ on which·
e _po_mt T* has fallen, its mean value, variance and mean square-
!Ie\'Iatton.
.:338 App/led Problems .rn Probabiltty Thtory

Solution It seems at first s1ght th 1t thls dens1 l) IS the same as the


dtstribut1on denstty I (t) of any tnterval T be tv. een the events It l.9
not, ho\ve' er the ca.g;e That a potnt t* thro,vn at random has fallen
-on the Interval T* change~ 1ts dlstrlhutlon Indeed, 1f there are a
var1ety of 1ntervals (largo and small) on the t axts, then there is a
.greater probability that the potnt t* \vlll fall on the larger tnterval
Let us f1nd the dtstr1button dens1t) /* (t) of the 1nterval T* on ~htch
the p01nt t* hns fallen \Ve seek the element of probabtltty /* (t) dt
~qual to the probab1l1ty that the po1nt t* Will fall on the Interval v.ho~e
length 1s 1n the range (t t + dl) Th1s probahlltty ts approximately
~qual to the ratto of the total length of all such Intervals w1th1n a very
large Interval Of time Q to the total length of that lnlerval
Assume that a large number /'{ of 10 terval q bet \veen the events fit
tn a ' ery large 1n terv a 1 Q The average n u mher of 1 n ter\ als, "ho~e
iength ts 1n the range (t t T dt) 1s l'r f (t) dt, the average total length
.of all such •nter\ als 1s approximately tNj (t) dt The average total
length of all tho N Intervals Q 1S (a pproxtmatel y} Nm,, 'vhere mt =
00

M [T] = ~ tj (t) dt On 1dmg one by the other, we get


(J

!* (t) d t ~ A t I (dl dt ~ t I (l) dt


Nmt mt
The approximation beeomes the more exact the longer the Interval
~~ t1me Q \Ve consider (the larger N) In the llmlt, the dtstributlon of
the random vartable T* IS
1
f* (t) = Tnt
t (t) (t > 0} (10 8)
00

:r-.1 [T*J = f
mt ~
\ f 1
'/ (t) dt =
1
mt
aJ (t) =
1
m1
(Vart + m~),
0
00

Var IT*]= ct2 (T*]- (At [T*])2 = t


ma
rJ t3 I (t) dt- (ill [T* na
J 0
10 9 On the hypothests of Problem 10 8 Palm's fio\V ts an elemen-
tary flo'v Wtth 1ntens1ty 'Jw 1 e f (t) = Xe Xt (t > O) Ftnd the d1str1
but1on density /*(t) of the Interval T*
on '' hich the potnt t* 'vill rall
Solut1on Stnee m 1 = 1/.A~ formula
H (10 8) y1elds f* (t) = A2
te-tt (t > O).
''htch IS none other than Erlang s
F1g 10 10 dtstrtbut~on of the 2nd order [see for
mula (10 0 3) for k = 2]
10 1 0*. There IS a Palm no,v of events ,v1th d ens1 t y f {t) of the Inter-
-val T bet\veen successtve events on the t axis A random pornt 1*
'(an 'inspector') falls somewbore w1th1n the 1nterval T* (F1g 10 10)
It d1v1des the 1nterval 1nto t'vo subintervals Q, from the nearest pre
v1ous event to t*., and H from t* to the nearest succe~stve event Ftnd
the dtstrl bu tions of the t \VO 1n t erv als
Ch. 10. Flows of Events. Markov Processes 337

Solution. Let the random variable T* assume a values: T* = s, and


find the conditional distribution of the interval Q under this con-
dition. We designate its density as fQ (t Is). Since the point t* is thrown
on the t-axis at random (irrespective of the events of the flow), it evi-
dently has a uniform distribution within the interval T* = s:
fQ (t Is) = 1/s for 0 < t< s. (10.10.1)
To find the marginal distribution f Q (t), we must average the density
(10.10.1) with the "weight" f* (s). Using formula (10.8), we obtain
00

f* (s) = s-
lnt
f (s) . /Q(t)= J/Q(tis)f*(s)ds
0

Taking into account that f Q (t I s) is nonzero only for s > t, we can write
00 00
1 1
/Q(t)=
Jf s f(s)ds=
smt mt
IJ f(t)dt=
mt
[1-F(t)],
·
t t
where F (t) is the distribution function of the interval T between the
events in the Palm flow.
Thus we have
1
fQ (t) = mt [1-F (t)]. ('10.10.2)

It is evident that the time interval H = T* - Q has the same dis-


tribution:
1
fH(t)= [1-F(t)]. (10.10.3)
mt
10.11. Using the results of the preceding problem, verify the solution
of Problem 10.6, which we obtained from other considerations.
Solution. We have f (t) = A,e-1-t, F (t) = 1 - e-7-.t (t > 0), mt =
111.. By formula (10.10.3)
fH (t) = '" [1 - 1 + e-'· 1
] = A.e-7-.t (t > 0),
i.e. the solution of Problem 10.6 is correct.
10.12. A passenger arrives at a bus stop irrespective of the time-
table. The flow of buses is a Palm flow with intervals uniformlv dis-
tribute~ in the range from 5 to 10 min. Find (1) the distribution d~nsity
of ll~e lnlerval during "·hich the passenger arrives; (2) the distribution
dens~~~' of time H for which he will have to wait for a bus; (3) the aver-
age lime he must wait for a bus.
Solution: We lHl.ve f (t) = 1/5 for t E (5, 10), mt = 7.5.
II (1) B~: 1ormula (10.8) f* (t) = t/37.5 for t E (5, 10). The graph of
lt' densrty f* (t) is shown in Fig. 10.12a.

0 for t~5,
(2) F (t) = (t- 5)/5 for 5 < t~ 10,
1 for t > 10.
338 Appl!td Probl~ms in Probablllty Thtorv

f*(t)

---t----.&;...,;~~-t
{1 ; 10 0 5 10 t
fa) {hJ
Fig 10 12

(3) The a' erage 'vatting t1me


5 10
mil= ) 1t 5 dt + ~ t 1~.;-;t dt ~611 m1n
0 1
10 13~ \Ve constder Erlang•s no,v of order h With a dtstrJbUtiOD density
of the Inter\ al T bet" een events
1
•. .
.-.-_ ___.A:------.
I" (t) = ,. (~t}Jt.-l -:lt (t > 0) (10 13 t)
0 v-....;; l '• t
(k- t)l e
t
F•g 10 13 Find the dIS tr1bu tton funct1on F k ( t) of
thts Interval
Sol ut1on \Ve caul d find the d 1s tri bu tion func lion ns1 ng the ordi-
nary formula
i

F,.(t)""' ) Ill (t) dt,


0

hut 1t IS s1m pler to find 1t proceed tng from the d efin 1t 10n FA (t} =
P{T < t}
'' e pass to the oppos1te event and find P { T > t) \\ e associate the
1

or1g1n 0 '' 1th one of the events 10 Erlang's flo'' and lay off tv. o Intervals
to the r1gh t of 1t T (the d 1stance to the ne"'\. t e'en t 1n Er langts flo,v)
and t < T (Fig 10 13)
For the Inequality T > t to be sattsfied, It ts necessary that less than
h. events 1n the elementary flow w1 th 1nt ens 1t y 1 fa II on the 1n ter' al t
{c1ther 0 or 1, , or k - 1) The probe btltty that m events fall on
Ch. 10. Flo ws of Ev ent s. Ma rko v Pro ces ses 339

the interval t is
- (A.t)m - M
Pm - ml e .

By the pro ba bil ity ad dit ion rul e


k-1

P {T > t} = ~ ('A~~ e-~t,


m= O

whence
k-1

Fk (t} = 1 - ~ (J.t)m e-~t = 1 - R (k -1 , /.t), (10 .13 .2)


ml
rn= O

wh ere 1 - R (m , a) is a tab ula ted fun cti on (se e Ap pe nd ix 2).


w of co mp ute r fai lur es is Er lan g's flow of ord er k
10.14*. Th e flo
de ns ity (10 .13 .1) of the int erv al be tw ee n the fai lur es (w he n the
with the ly} .
wn , it is bro ug ht ba ck int o op era tio n ins tan tan eo us
computer goes do st
cto r" arr ive s at a ran do m mo me nt t* an d wa its for the fir
An "inspe
ure. Fin d the dis tri bu tio n de ns ity of the tim e H for wh ich he wi ll
fail
have to wa it for a fai lur e an d its me an va lue mn.
Solution. By for mu la (10.10.3)
1
fH (t) = mt [1 -F Jl{ t)] ,

where Fh (t) is giv en by for mu la (10 .13 .2) an d mt = k/1.. He nc e


k-1 k-1
t) = A. ""' (/.t )m _,. , _ 1 -v 'A ('At)m e-~t (t > 0). (10 .14 .1)
fH ( k Ll ml e - k Ll [ml
m= O m= O

We rewrite for mu la (10.14.1) in the for m

A. p.t)T-1 e-M (t > 0). (10 .14 .2)


(r- 1) !
r=1
d"
I~ ca~ be seen fro m (10 .14 .2) th at the ran do m va ria ble
of
H
dif
ha s
fer
a
en t
"m ixe
ord ers ;
?Js tnb uti on co nsi sti ng of k Er lan g's dis tri bu tio ns
dis tri bu tio n of ord ers 1, 2, .. . , k wi th eq ua l pro b-
lt .h?s Er lan g's
1/k . Th e me an va lue of su ch a ran do m va ria ble ca n be fou nd
;hl hty
rom the co mp let e ex pe cta tio n for mu la
k

mH = Ivi [H ] = ~ ~ M [H I r], (10 .14 .3)


r=1

~here ~~ lH l r] is the co nd iti on al ex pe cta tio n of the ran do m va ria ble


prov1ded tha t it ba s Er lan g's dis tri bu tio n of the rtb ord er.
340 lpplitd Probltms In ProhablUtrt Thtory

t rom the ftrst formula (10 0 4) '' e find that l\I [H 1r] = ri'A, "hence
k

mn ==
f "'
k~ LJ r -
(k+f)'
2k~ =
k+ f.
2X: " (10 14 4)

10 15* A Palm flow of events \Vtth dtstr1bnt1on denstty I (t} for


the 1nterval bet\veen the events 1s subJected to a p transformat1on (c:.ee
Problem tO 2) A random varaable V ts the tnterval between the events
1n the transformed flow Ftnd the mean 'alue and vartance of the
random var1able V
Solut1on Tho random vartablo V ts the sum of a random number
y
of independent random v1rtablcs (see Problem 8 63) V = ~ T~
1 1
;!Ill;

\vhere Y IS a dtscroto random var1able 'vb1ch has a geometrtc dts-.


=
tr1but1on P {Y == m} pqm-t (m= 1 2, ) q= 1- p and each of
the random vartables T" has 1. d1str1bUt1on f (t)
Then the succcsstvc Intervals between thee' ents tn tha p-translormed
flo'v are

where the random var1ables V1 V 1 are dtSJOlnt and the trans-


formed Oo\V is a Palm flow In accordance ''1th Problem 8 63.,

\vhere

Jtf(t) ell, J(t -m


00 QC

m, = Var 1 = 1)" f(t) dt


0 0

Remark \Ve caD prove that a mult1ple p transformatton of a Palm Dow re.sult!
in a flOW Which IS elose to an elementary now
10 16 Ftnd the dtstrlbulton of the Interval T bet\veen the events
1n a Palm flow 1f the random var1able T can be determtned from the
y
express•on T = ~ T, 1 a 15 the sum of a random number of random
h--i
terms where the random vartables Tk are Independent and have an
exponential d1strlbution 'v1th p1rameter 1 and the random var1able Y
does not depend on them and has a geometriC dtstributton begtnntng
WJth un1ty Pn = P {Y = n} = pgn 1 (0 p < <
1, n = 1 2 3 )
Solutaon As shown 1n Problem 8 62,. the random var1able T has an
exponential dastrtbutton 'vtth parameter ~p and consequently the
Palm flow be1ng cons1dered 1s an elementary flo\V wtth tntenstty ~p,
'vhtch results from a p transformation of an elementary fioW With 1n
tensity 7v Thts confirms the correctness of the solution of Problem 10 2
Ch. 10. Flows of Events. Markov Processes 341

10.17. The flow of buses arriving at a bus stop is a Palm flow, and
the interval T between each two buses has a distribution density f T(t).
A bus stays at the stop for a nonrandom time -r. A passenger arrives at
the stop at a random moment t* (Fig. 10.17a). He boards a bus if it is
at the stop and waits for a time -r' if there are no buses, and, if no
buses arrive at the stop during that time, leaves the stop and walks

r,
1\

7: t~ r' 7:
(a)
Pencd for Pertod for Perw:i for
r::ssengers vassengers passengers
!L! b:ord t,; !J.Jard tc .fcord
~::~;::;
>QL-~-;¢
r' -r T' -r
(c)

Fig. 10.17

to his destination. The distribution fT (t) is such that the random variable
T cannot be smaller than -r +
-r' (Fig. 10.17b). Find the probability
that the passenger will take a bus.
Solution. We consider the opposite event A ={the passenger "\Vill
not catch a bus}. This means that when the passenger arrives at the
stop at a moment t* there are no buses at the stop, and no buses arrive
~uringthe period he waits. Each event, i.e. the arrival of a bus at the stop,
lS f.ollowed by a period for the passenger to board, the width of the
renod being T' +
-r (Fig. 10.17c).
The .event A= {the passenger does not catch a bus} corresponds to
t~e pomt t* falling outside the boarding period (Fig. 10.17d). The point
t has a uniform distribution owr the whole length of the interval T*.
!he pr?bability that it will fall on the interval T* - (-r
15 0
-r'), which +
\Itside the boarding period, is (by the integral total probability
f ormula)
co 00

P(A)= \ f*(t)dt= \ t f(t)dt


~ ~ mt
l+'t" -r+•'
00

~t
't+t'
1 if (t) dt- '!;' j
't+'t"
f (t) dt.
1
"' r{e mt is the aYerage interYal between buses.
le probnbility tlwt the passenger will catch a hus is P (A) =
1- P (.4). .
!~.18.
l OWin t
An elementarY flow with intensity /.. is subJ·ected to thn fol-
• • "
g ransformation. If the distance between adjacent events T 1
3'2 A pplitd Probltmr fn ProbabllHg Thtory

1s Cimaller than some perm ISStb le l1 m1t t 0 (a ' sa.fet Y tn tervar•), then
tho e' ent 1s displaced by an 1nterval t 0 relat1 ve to the preced1ng event
NO\\ 1f T f > t 0 • then tha event rem a 1ns put {F tg l 0 18) Is the trans-
formed flo\Y formed by the potnts 9i, 9 2, •• ~~, on the t' -ax1s elemen
tar:-,.? Is It a Palm no,v?
Solution. The transformed flo'v 1s ne1tber elementary nor Palm's
s1nce 1t has an aftereffect that can e"ttond arb1 trarll} far For example,

0 e, 82 9J e~.~ 85 BG 6, Ba 9s @i.i 911


I • It
'I l
l
I
I
I
I I
I I l I
0
I lo
_............__ I to lu t to fo J
__...___~ ~ ..-""--. ~
lr
e'I Oz e; 8~ 9'5 8'5 e'l e; 8.9 e,a eft
Fig 10 18

the points 81 Oa e!h al 0 CfO\Vd the t' -a '\:IS and so e\ ery subsequent
potnt on the t a "tts ts sh1fted by a t1me 1nterval \Vhtch depends both
on when the events occurred and on the Inter' als bet,,een them 1n the
past If t 0 ts much less than the a \erage d 1stance between the events
10 the orJgtnal flol.v 1 e t 0 < tli... then 've can neglect t1te s.ltereiiect
in the transformed tlow
Tr
A
..
- •1
;
, I ,.~

'I ' Tg
l
I
l 'I 'II
J

- I
I
T~
t
j
....' f I •I I
• .~
I t J 1I
I l r, I r2 11jl T.
A4 ! 1 II
l
--·iF : -.~.-·,.. ~H~~-.l

Fig 10 19
10 19 T'" o Independent Palm flo\VS \VIth d1str1hutton denslttes
It (t) and /r (t) for the Interval bet,veen the e\ ents are superimpo<=ed
Is the resultant flo\v a Palm Do'v'
Solutton The superpos1t1on of t'vo Palm flo\vs 1s shown 1n Ftg 10 19
It Is clear that the Intervals T 1 , T 2, of the resultant flow III are
not Independent stnce thetr stzes are defined by those of the same
Interval on the a xts I or I I For 1nstance, T1 and T2 add together to
gtve T 11 and. hence, they are not Independent Thts relattonsbtp 19
rap1dly damped, ho\vever, as the pertod between the or1gtns of the
Intervals 1ncrea~es

Remark. \Ve tan prove that when a suffi.e1ently large number of Palm's O.om
wl th comparable in ten! 1t 1es are su per1 m pas ed. a no,v re9ul ts w hlch is cl o.s e to an
elementary flo\v
Ch. 10. Flows of Events. Markov Processes 343

10.20. A computer, when it is running, can be regarded as a physical


system S which may be found to be in one of four states: (s1) the com-
puter is completely sound, (s 2 ) the computer bas a negligible number
of faults in the memory and it can continue to solve problems, (sa) the
computer bas a considerable number of faults, but it continues to soh·e
a limited range of problems, and (s 4) the computer goes down com-
pletely.
Initially the computer is completely sound (state s 1) and is checked
at three fixed moments t 1 , t 2 , ta. The process in the system S can be
regarded as a homogeneous Markov chain with three steps (the first,
the second and the third check of the
computer). The transition probability
matrix has the form Sr
0.3 0.4 0.1 0.2 0.2
0.1
0 0.2 0.5 0.3
IIPiJII= • 0.8
0 0 0.4 0.6
0 0 0 1.0
0.3
Find the probabilities of the states of
the computer after the three checks.
Fig. 10.20
Solution. The directed graph of
the states of the computer is illus-
trated in Fig. 10.20. Each arrow shows the transitional probability.
The initial probabilities of states p 1 (0) = 1 and p 2 (0) = Pa (0) =
P4 (0) = 0.
~~i?g formula (10.0.4) and taking into account in the sum of prob-
abihties only those states from which a direct passage to a given
state is possible, we find

Pt (1) =
p 1 (0) P 11 = 1 X 0.3 = 0.3,
P2 (1) = p 1 (0) P12 = 1 X 0.4 = 0.4,
Pa (1) = p 1 (0) Pta = 1 X 0.1 = 0.1,
P4 (1) = p 1 (0) P 14 = 1 X 0.2 = 0.2,
Pt (2) = p 1 (1) P11 = 0.3 X 0.3 = 0.09,
P2 (2) = Pt (1) P 12 + p2
(l) P 22 = 0.3 X 0.4 + 0.4 X 0.2 = 0.20,
P3 (2) = Pt (1) Pta + p 2 (1) P2a + Pa (1) Paa = 0.27,
P4 (2) = Pt (1) P14 + P2 (1) P24 + Pa (1) P34 + P4 ('1) P44 = 0.44,
Pt (3) = p 1 (2) P 11 = 0.09 x 0.3 = 0.027,
P2 (3) = Pt (2) P 12 +p 2 (2) P 22 = 0.09 X 0.4 + 0.20 X 0.2 = 0.076,
P3 (3) = Pt (2) Pta + P2 (2) P2a + Pa (2) Paa = 0.217,
p~ (3) = Pt (2) P14 + P2 (2) P24 + P3 (2) Pa4 + P4 (2) P u = 0.680.
3~-~: ~ p p/fl!:d P roble m.1 i11 Prp&a billt y Theory

Thus the probab 1l1tles of states of the compu ter after three checks
are p 1 (3) 0 027, p~ ( 3) ~ 0 076. p 3 ( 3) = 0 21 7. p 1 ( 3) = 0 680
;;::=

10 21 A po1nt S \valkc; along the x n'ttS (F1g 10 21a) so that at


each step tl rematn s put \\ llh prob1b lltty 0 5, JUntps to the r1ght by
un1ty ~ 1th probab tltty 0 3 and JUmps to the left h} unitl '" 1th prob-
ablltty 0 2 The state of the S} stem S after ! steps Js def1ned by one
coordtnate (abscts sa) of tho p01nt S The in1 t1 ,1 pos1 lion of the po1nt
IS the orJgtn Con~1der1ng thn ~cquence of positJons of the po1nt S
3

3
1 l
2 1 0
l...---
1 -·2 I I d

J
f I I
X

(a)

05 a5 os a5 os

02 Ol 02 02 C"
(6)
r1~ to 21
to be a \farh.ov cha1n find the prob1.hil1t~ that after four steps tt \\dl
he no farther than a untt} from the or1g1n
Solut1o n \\ e des1gna te the st '\te of the system (pOint S) as s1 '\vhere
l IS the coord 1na te of S on the abqc ISS 1 a""'= u; The marJ..ed graph of states
IS sho\\ n 1n Ftg 10 21 b (For the ~ake of clartt} , loops are sho" n tn the
figure 'vh1ch corresp ond to the delal- ~ of S tn the prev1o us pos1t1on)
The sequence of states forms a l\1 'lrko\ cha1n 'v1th an 1nfintte number
of states The trans1t1on 1l probnbl11 t1cs P 11 are nonzer o onl' for 1 = i,
i = ' - 1, I == t + t P, l 5 P1 t+l =o 3 p~ 1_ 1 = 0 2 All=o
the other transit ional probab1 l1tJes are zero The requ1re d probab 1ltty P
IS equal to the sum of probabtl1t1es Po (4) +
p 1 ( 4) _L p_1 (4) '\ e can
find them \lslng the recurre nce- relat1o ns (10 0 14)
\\ e ha\ e Po (0) = 1 p 1 (0) = p 1 (0) = = 0 Furthe rmore

Po (1) =Po (0) Po 0 = 0 5 p 1 (1) =Po (0) P 11 1 = 0 3~


P-t (1} ~ Po (1) P0 _1 = 0 2
= Po (1) p o o + P1 (1) Pt o + P-1 (1) P-1 o = 0 5 X 0 5
ro (2)
+ 0 2 X 0 3 0 3 X 0 2 = 0 37 +
P1 (2) - Po (1) Po 1 + p 1 (1) P 1 1 = 0 5 X 0 3 + 0 3 '< 0 5 = 0 30,
Pa (2) = p 1 {t) P 111 = 0 3 X 0 3 = 0 09,
P-t (2) =Po (1) Po-t +
p 1 (1) P _1 _ 1 = 0 5 X 0 2 0 2 X 0 5 = 0 20, +
P-~ (2) = p_1 (1) P -t _ 2 = 0 2 X 0 2 = 0 04,
Po (3) = Po (2) Po o + P1 (2) P1 o + P-1 (2) P -t o = 0 305,
s 345>
Ch. 10. Fl ow s of Ev en ts. fif ar ko l' Pr oc es se

, + p (2 ) P + p (2 ) P ,1 = 0. 27 9,
p 1 (3) = P o (2 ) P 01 1 1 , 1 2 2

p2 (3 ) = p 1 (2 ) P 1 , 2 + p 2 (2 ) P 2 , 2 = 0. 13 5,

P3 (3) = p 2 (2 ) P 2 , 3 = 0. 02 7,

P-1 (3) = P-2 (2 ) P -2 ,-1 + P- 1 (2 ) P -1 .-1 + Po (2 ) P o,-1 = 0.1_86,

(3) = P- 2 (2 ) P -2 ,-2 + P- 1 (2 ) P -1 .-2 = 0. 06 0,


P-2
p_ 3 (3) = p_ 2 (2) P _2,_ 3 = 0. 00 8,
) P1 ,o + P o (3 ) P 0, 0 + p_ 1 (3 ) P _ 1, 0 ~ 0.264,
Po (4) = p 1 (3
) P2 ,1 + P1 (3 ) P1 ,1 + P o (3 } Po .1 ~ 0. 25 7,
P1 (4) = P2 (3
+ ) P -1 .-1 + P- 2 (3 ) P -2,-1 ~ 0. 17 2.
P-1 (4} = Po (3) P o.-1 P-1 (3
The re qu ir ed pr ob ab il it y •

p = Po (4) + P I (4) + p_ 1 (4) ~ 0. 69 3.


ev en t A th at af te r fo ur st ep s th e po in t S
Thus th e pr ob ab ili ty of th e 3.
he r th an un it y fr om th e or ig in is 0. 69
will be no fa rt us pr ob le m fin d th e pr ob -
10.22. U nd er th e co nd iti on s of th e pr ev io
ri ng th e fo ur st ep s w as th e po in t S fa rt he r-
ability th at at no ti m e du
than un ity fr om th e or ig in .
Solution. W e de si gn at e as 0 p
(k ), 1 p(k ), p_
1 (k ) th e pr ob ab
po
ili tie
in tS
s-
th at up ti ll th e kt hs te p in cl us iv e th e
ofstatess0,s1,s_H pr ov id ed
wa•s •never fa rt he r th an un it y fr om th e
ongm. 0.5 0.5 0.5
At first gl an ce th is pr oc es s is no t
a Markov ch ai n (s in ce th e pr ob a- 1 0.3 \ i 0.3 \ j
bilities of fu tu re st at es de pe nd on th e
~
~

S-1
~

So s, 1\
"preh"!S•tory" , i.e . on w he th er th e /
r th an 0.2 0.2
point S w as at le as t on ce fa rt he '

unity from th e or ig in ), bu t it ca n
~e reduced to a l\I ar ko v ch ai n by 0.2 s~~ 0.3
Introducing an ot he r st at e, s*, w hi ch
corresponds to th e po in t ha vi ng go ne Fi g. 10.22
farther th an by un it y fr om th e or ig in
at least once. is.
is il lu st ra te d in Fi g. 10 .22. Th er e
The di re ct ed gr ap h of st at es 1\ Ia rk ov ch ai n
0 of th e ot he r st at es ; in
~ passage fr om st at e s* to an y
st at e. Th e pr ob ab il it y of
ab so rb in g
\le or y su ch a st at e is kn ow n as an Y er be fa rt he r th an
ne
t ~ ev en t A = {i n fo ur st op
1 s th e po in t S w ill
at ed as th e su m of pr ob ab ili tie s
~nity from th e or ig in } ca n be ca lc ul
es po nd -
+
fo (4) P- 1 (4) +p 1 (4) fo r a sy st em w
e
it
in
h
Y
a gr
ite
ap
th
h
e
of
re
st
ad
at
er
es co
to
rr
ca lc ul at e
~~g to th at sh ow n in Fi g. 10 .2 2. \V
lese pr ob ab ili tie s .
m un it s,
./0}23. A ~ystem S co ns is ts of a te ch ni ca l de vi ce m ad e up of
t .• • , th ) un de rg oe s pr e-
~enc_ _ from. tim e to ti m e (a t _m
1 om en ts
ea
t
ch
,
1 2
st
,
ep (t he m om en t of m ai n-
nh \e m m nt en an ce an d re pa ir . A ft er
J-i6 Applied Problemt In Pro babllltfl Thror11

tenance and repatr) the system rna} be 1n one of m states (s 0 ) all the
•tntts are sound (no unt ts are teplaced by nc'v ones) (s1) one U[at ts
~eplaced by a ne\V un1t and the other untts are sound (s 2) l\VO un1ts are
replaced h) ne'v one41: and the other un1ts are sound s1 - ~ units
-{t < m) are replaced by ne\v ones and the other un1ts are sound Sm
all m untts are replaced by new ones
The prob ab l11 t y that at the t 1me of pre ven t1 ve rna 1n ten anee a un1t
"tll be replaced by '1 new one 1s p (Independently of the states of the
tOther un1ts) Cons1der1ng the states of system S to be a r.rlarb.ov cha1n
find the trans1 t1onal probablht1es
0216 036 as f and calculate the proba bdt ttes
of system S for m = 3 and
04 1 -- 0 4 after three steps (1n1
ttally all the un1ts are sound)
Soluhon '' e '' rtte the trans1
016 ltonal probabtltties of the chain
destgnattng q = 1 - p For any
0 ()B~ state s, of the system the prob
abtltt} P ,1 1s zero for 1 < i
F1g to 23 the proha bJI l t y P 1, JS equal
to the pro bab1l1 t y that no untts
'vtll have to be replaced h) lle\\ ones at a g1' en step 1 e m - J
'()f the old un1 ts rem a1 n 1n the dev1ce P 1 ,~ = qm ' For t < j the prob
.abll1ty of the tran.s1t1on P 11 1s equal to the probab1ltty that at a g1ven
~tep 1 - i untts out of them ~ ' old un1ts w1ll have to be replaced by
nev, ones U ~ 1ng the b I nomtal d l~trtbu tton \\ e find p 11 = C~..!i]J 1 1q"" J
The state sm lS absorbtng For m = 3 p = 0 4 the dtrected graph of
the states of the "}stem bas the form sho,\n 1n F1g 10 23
0 216 0 432 0 288 0 064
0 0 36 0 48 0 16
UP,I!1- 0
0 06 04
0 0 0 10
I
'\ e have Po (0) = 1 and p 1 (0) = p 1 (0) = Ps (0) = 0 \\ e find the
probab111 ty of states after one t'vo three steps
Po (1) = 0 216 Pt (1) = 0 432
P2 (1) = 0 288 P3 (1) = 0 064
P1 (2) = P1 (1) P11 +Po (1) P 01 ~ 0 249
P2 (2) = P2 (1) P~a -- Pr. (1) :P 02 +p 1 (1) P12 ~ 0 442
+ +
Ps (2) = Pa (1) Pa, P2 (1) P2s Pt (1) PI3 +Po (1) Po3 ~a 262
Po (3j =Po (Z) Poe ~ 0 0!0
P1 (3) = Pt (2) P11 -r- Po (2) I> 01 ~ 0 110
P2 (3) ~ P2 (2) P22 +Po (2) P 0 ~ + p 1 (2) Jl12 ~ 0 398
Ps (3) = P3 (2) P~s +Po (2) P os + Pt (2) Pts + P2 (2) P:za ~0 482
Ch. 10. Flows of Events. Markov Processes 347

10.24. A computer is inspected at moments t 1 , t 2 and t 3 • The pos-


sible states of the computer are: (s 0) it is completel y sound; (s1 ) there
are a negligible number of faults, which do not prevent the computer
from functioning; (s 2) there are a considerab le number of faults, which
make it possible for the computer to solve only a limited range of
problems; and (s3 ) the computer goes down.
ThB transition probabilit y matrix bas the form

Construct the directed graph of states. Find the probabilit ies of the
states of the computer after one, two, and three inspection s, if at the •

beginning (t = 0) the computer was completel y sound.

0./i
,...... O.'t O.J 1

~ ~ 1 I
0.3 0.4 0. 7
So S't s2 83 liol.4 fs;l 3

(6)
0.2 0.2
Fig 10.24 Fig. 10.25

Solution. The graph of states is shown in Fig. 10.24.


Po (1) =Po (0) P 00 = 1 X 0.5 = 0.5,
Pt (1) =Po (0) P 01 = 1 X 0.3 = 0.3,
P2 (1) =Po (0) P 02 = 1 X 0.2 = 0.2,
Po (2) =Po (1) P 00 = 0.25, p 1 (2) =Po (1) Pot+ P 1 (1) Pu = 0.27,
P2 (2) =Po (1) P 02 +
p 1 (1) P12 +
P2 (1) P22 = 0.28,
P3 (2) = P2 (1) P 23 + p 1 (1) P13 = 0.20, Po (3) =Po (2) P 00 = 0.125,
Pt (3) = Po (2) P 01 +
p 1 (2) P11 = 0.183,
P2 (3) =Po (2) P 02 + p 1 (2) P 12 + P2 t2) P2 2 = 0.242,
Pa (3) = p 1 (2) P13 +
p 2 (2) P 23 +
p 3 (2) P 33 = 0.450.

10.25. We consider the operation of a computer. The flow of failures


~malfll:nctions) of an operating computer is an elementar y flow with
mtenstty A.. If the computer fails, the malfunctio n is immediate ly
dt>te~te~ and the computer is repaired. The distributio n of the time of
:ep~tr 1s exponentia l with parameter p.: q> (t) = 1-'-e-J.l.t (t > 0). At the
101
1Jal moment (t = 0) the computer is sound. Find (1) the probabilit y
that at a moment t the computer functions; (2) the probabilit y that
348 tpplted Problems In Probablllty Theory

dur1ng the t1me (0, t) tho computer malfunctions ~t least once (3} the
}Jmttang probnbtltties of the states of the cotnputer
Solut 1on (1) 'fh e states of the sys tern (the computer) are- (s 0 ) 1t ts
~ound and functions, (s1) lt ls f~ul\.y and lS reptilie-d The mar\-.ed graph
of states JS sho'\\ n In F1g 10 25a
The Chapman holm ogoro\ equa t 1on (;1 for t be probabJltttes of stat~
Pa (t) 1nd p 1 (t) ha\ e the form
~~o =JlPI-l.Pot d:/ =Ap0 -JlP1 (10251)
Etthlr of thP~e equ 'l.ttons can he deleted Since for any n1oment t Vt-e ha' e
Po + p 1 - 1 SubstitUting p 1 = 1 - Po 1nto the first equat1on (10 2a I)
"e obta 1n on~ d1 fferent1al equat1on "1th respect to Po
dp 0 /dt = fl - (J.. + ~t) Po
Sol\ 1ng th1s equ1t 10n lor the InJlial cond1tton Po (0) = 1, "c obtain
Po (t) ::- __'!- (1 + 1 e -()..+J.t) 1] ( 10 25 2)
k+ ~\. \1 '
\\hence
(10 2a 3)
-
(2) To ftnd t lH? prob"lbJilt~ p (t) that d ur1ng the t1n1e t the computer
malfunctions 1.t leaqt once "e Introduce ne'' ctates for the s~ stem
- -
(sO:) the computer ne'er failed (s1 ) the compute-r malfuncttoned at lea~t
,....,
once The state s 1 Is the ab~orh•ng one (~ce F1g 10 25b)
"""
Sol' 1ng the Chapm1.n holmogorov equat1on dp 0 /dt = _, p 9
for the
"" ,..,
1ntt1al cond1t1on Po (0) ~ 1 ''e get p 0 (t) :;;;;:
~

e-'
t, ''hence p 1 (t) =
-
1 - Po (t) = 1 - e '-t Thus the probabtllly th1.t durtng the t1me t
the computer malfunct1ons at ]cast once 1s 1 (t) = 1 - e p 1n th1~ '1
case the proba btltt y cuu Id La' e been calculated more qim pl} 1 e a.s the
probabtl1ty that durtng the ttme t at least one e' ent (malfunctton)
'" tll occur 1n 1.11 elemen tar} flo'' of malfunctiOns '' 1th Inten~tty 'A
(3) As t -+- co "e get from cquattons (10 25 2) and (10 25 3) the
l1mtt1ng probabllxt1es of states Po = ~t/(J._ + ~t) p 1 :::=::: J../('A ~t) wbtch +
could e \ en t uall l have been obtained d1rccti} from the graph of states
ustng the btrth and death proce~s (v. e 1nv1te the reader to do th1s)
10 2G On the hypothesis of the precedtng problem the malfunctioning
of the computer IS not lmmedtately not1ced but IS detected after an
Interval of time ''htch has an exponent1al d 1.str1but1on w1th parameter v
\Vrtte and solve the Chapman h.olmogorov equations for tho probabdt
t1es of states Ftnd the !Imlttng probabilities of states (from the dtrected
graph of states)
Solut1on The states of the system are (s 0) the computer IS sound
and operates (s1) the computer 1s faul ll but the ma]functton IS not
detected, (sz) the computer 1S he1ng repa1red The graph of states ~~
shown tn Fig 10 26
Ch. 10. Flows of Euents. llfarkou Processes 349

The Chapman-Kolmogorov equations for the probabilities of states


.are
~
dpo
dt = ~Pz- "'Po•
• dpl
dt =[\,Po- v P1' dp 2
dt =
"
v Pt- ~tP2· ;;(10 · 26 · 1)
We shall solve this system using Laplace transforms. With due regard
for the initial conditions for trans-
forms :n:i of probabilities Ph equa-
tions (10.26.1) assume the form
s:n: 0 = 11n 2 - l.n 0 + 1,
s:n: 1 = 'An 0 - vrt 1 ,
SJt2 = VJtl - ~ll12.
Fig. 10.26
.Solving this system of algebraic

equations, we get the following equations for the transforms:
'A. v 'V 'A.
~ ~ n;- n;- n
••t= s+v ••o• 2- s+ll 1- (s+v)(s+~l) o•
(s+ v) (s+ !!)
no= s(s2+s(f.t+v+t..>+vt..+vll+l.~t)'
\i'e introduce the designations
~~~~-----------
a=!l+~+'· + 1/ (!!+:+1.) 2
-VA-V!-L-/.!1,

b = - ~t+~ +I. -l/(fl.+~+t.)z -v'A-v~ -All.


Then the expressions for probabilities assume the form

Po ( t) =
aeat_bebt
a-b + (v+ ~)
eat_ebt [ 1
a-b +!-LV ab +
beat-aebt
ab (a-b)
J'
Pdf)=l.
8 at_ 9 bt
a-b +'All
[ 1
ab +
beat-aebt
ab(a-b)
J '
1
p.'!.(t)=VA [ ab + beat-aebt
ab(a-b)
J•
1o find the limiting probabilities, we can use either the transforms
()r the probabilities themselves:
p 0 =lim p 0 (t) =lim sn 0 (s)
t-eo s ..... O
= t.. ~t+~~~'+vtt• ,

Pl = /,~t - 'J.v - 1-
(10 26 2) -
I.!J.+I.v+v~t ' P2- Po Pt- lt..ll+lvv+''!l • . .
\re can find the final probabilities of states directly from directed
graph in Fig. 10.26: MP 2 = l.p 0 , l.p 0 = vpi, vpl = !1P 2 • We can delete
Qne of these equations (say, the last one). Expressing p 2 in terms of
Pr and P1 , i.e. p 2 = 1 - p 0 -PI, we get two equations
f.l (1 - Po - PI) = l.p 0 and l.po = vpp
The solution of these equations yields the same results (10.26.2).
3:s0 tpptted Probltmr tn Prabablltty Theory

10 27 An electronic devu:e conslsls of t''o 1dent1cal repeat1ng untts


For the dev1ce to operate It JS suffie1ent that at least one unit functtons
\Vben one of the un1ts fa1ls, the device continues to funct1on normally
due to the other unit The flow of fatlures of each unit is elementary
w1th parameter A \Vhen a un1t fa1Is, 1t ts Immediately repatred The
t1me needed to repair the untt 1s exponent tal 'v1th parameter Jl lntt1ally
(t = 0) both un1ts operate Ftnd the following characteriStlc.s of the
operatton of the device
(1) The probab1ltttes of sta.tes (as functions of ttme) (s 0 ) both untts
are sound, (s1) one un1t 1s sound and the other ta be1ng repa1red, (s1) both
untts are he1ng repaired (the device does not operate)

.......
2.l ...
.
s,
,...,; ~
~

So
~

-.... Sz
(h)
ODes not Does not
Operates cp~~ate Operates operate
,,-----..JA,.___--..
\ ~------ow,

(c)
Fig 10 27

lfllw

(2) The probability p (t) that d urtng t1me t the de' 1te ne'er falls
(3) The l1m1 t 1n g proba b 111 ties of states of the de' tee
(4) For the limiting (stationary) condtt1ons of the devtce the average
rel a t1 ve t1 me for w h1ch the devtce 'viii operate
(5) For the same llmtting cond1t1ons the a' erage time ~P of £allure-
free operatton of the devtce (from the moment 1t 1S s'v1tcbed on after
hetng repaired t1ll the next malfunctton)
Solution The dtrected graph of states of the dev1ce ts shown lD
Ftg 10 21a (2A. 1s put before the upper Ieft arrow s1nce o units operate- t''
and may fatl) for the same rea son 2~.t. Is put before the lower right
arro\v stnce both un1ts are he1ng repaired)
(1) The Chapman h.olmogorov equat1ons ha\e the form

d:t = ~tp 1 - 2Ap0 dft 1


- 2Ap 0 +211P:L-(A+~t) Pt•
(10 27 1}

ln thts ca~e the following condttton must be fulfilled p~ p1 pj = 1 + +


Solving this S}Stem of equat1ons for the Initial conditions Po (0) = 1,.
Pt (0) -- P2 (0) = 0 ''e obtaJn
erl I -e~t .. [ t beat_ aebl ]
ae at :t.. 9 bt
Po( t)= '"',. --~q +(A...L.3n)
r «+b + 2f~. ; eb
+~-~
4b (a-b} '
Ch. 10. Flows of Events. 'Markov Processes 351\

where
a= -(/•+f.l), b= -2 (l..+f-t),
a-b=A.+f-l, ab=2(l..+f-t)2,
_ a2eat_b2ebt (i..+SJ.l) (aeat-bebt) 2J.l (eat-ebt) 2i.. ()
Pt ( t)- J.l(a-b) + J.l(a-b) + a-b + J.1 Po t.
From the expressions obtained we get
p 2 (t) = 1 - Po (t} - p 1 (t} and p 2 (0) = 0.
"'
(2) To find the probability p,...., (t}, we make the state_s2 (the device
stopped operating) absorbing (s 2 ) (Fig. 10.27b). For the probabilities.
of the states the Chapman-Kolmogorov equations are
N ,..., ,_..., -., AJ - ,.._, ,....,

dpofdt = ftp 1 - 21..p 0 , dp 1 /dt = 21..p 0 - (!.. + J.L) p 1, dp 2 /dt ,...,= l..pl' .~.

Solving the first two equations under the initial conditions Po (0) =
""
1, P1 (0) = 0, 've obtain
,..., ae"t _ ~ 6 1lt 6 cd _ 6 1lt
Po(t)= a-~ +(I..+J.L) a-~ '
where
-(Sf..+~~)+ y 1..2 + 6I..J.1+ J.1z
2 '
(the quantities a: and ~ are negative for any positive values of 1.. and J.L)~
Furthermore,

""" 1 a'J; 21. """ 1 [ azettt- ~zellt


pt(t) = J.1 a/ + ~~ Po= J.1 a-~

+<t..+ J.L) ett


ae a=Be
R (3t J+ r."' 2' ,...,
Po (t).

~he required probabilities p(t) = p (t) + p (t). Note that the function
0 1
P (t) is monotone decreasing, and p (0) = 1, p ( oo) = 0.
p· (3) The final probabilities of states can be found from the graph in,
lg. 10.27a and the general formulas (10.0.23) for the birth and death,
process
21. 21.2 ( I. ) 2 i
Pt = ~~ Po; Pz = 2 ~12 Po= ~~ Po• Po+ Pt + P2 = '
Po= r1 +2A.ht + (A.ht)2r1 = J.LZ/(1..+ ~t)2.
(4) The average relative time for which the device will operate is
1_ " =- 1 _ ( /, ) 2 ( ~L ) 2 = 1 _ ( . i. ) 2
p_ Jl t..+~t A+p .
f (5) The variable t:,P is the expectation of time Top which passes
Ir~llll
ar ure.
the moment the device is switched on to the moment of its next
~5- A p p hed Pro bl ~ m • ln Pro ba btltt ll T heorv

Let uq. corunder the operat1on of the device 1n stationary condtttoru


to consist of a number of cycles • opera test' and "does not opera ten (see
F1g 10 27c, '\here the operat1ng sect1ons are shO\\ n 1n bold ltne) The
.a" erage t1me tDonop for \Vh1cll the devtce does not operate (the expecta
t1on of the length of a nonoperattng period) 1s evtdentl} 1/(2~) (since
a flo'V of trD.flSltiOnS to an operatang Stato Wtth InteOSllY 2lt aCtS on the
de' 1ce \Vh 1ch lS In the no no pera ttng state).
Furthermore the ratto of the average t1me of trouble free opera.tton
- -
t 0 p to th d t of ""nonopera lion, tnono p lS equal to the rat 1o of the final

.- ~

6u
~

.. ~
s,
#01
~

Sz

(6)
1 ig to 2S
proLabliJ t~ 1 - p 2 (that the dev1ce ts opcrattng) to the probablltty p1
(that 1t does not operate), 1 e. t:r/~onol.l = (l - p 2 )/p 2 • From th1s,
bearing 1n mtnd that ~onoP = 1 /(21L), \\e get
- - i ~ Pt 1 i - Pt
t(!p = ti)On()P =............._--:;,....;~
p., 2!-L Pl
10 28 The condtttons and quest1ons are the same as 1n Problem 10 27
e1.cept that as long as one of the un1 ts operates, the other 1s hept 1n
reserve and so cannot fa11 '''hen tbe ce~e.rve unlt IS sn .1tcbed an tt
beg1ns operating at once and IS subject to a flo,v of fatlures 'v1th 1nten..,
Slty A
So1utton The d1rected graph of states of the de\ 1ce has the form
shown 1n F1g 10 28a and the graph '' 1t h an 1. bqo rbtng state 10
F1g 10 28b.
Answers
(i} P 0 (t) = 'ae~1-be~t +(A+ 3 L) e~tt -ebt +2 2[ J + beal-ofibt J,
a- b ~ a -b P.. ab ab (a -b)
p ( t} = a'ea t - b!c~t + (A+ 3r.) (ae(Lt _ beM)
1
f~ (a-b)i 11 (a-b)
-+ a~b (eal_elliJ+ ~ Po(t), p~(t)=1-Po(t)-pt(t),
a= - (2"'+3ft) + lr4l-t£+ f.li=w b - (2X+3J-tl +2 V,. 4JJA...l- ~' 1

2 ' =
,_ a:ea t _ ~e~f eat -et1t
{2) p 0 (t)= $!-..::;. -+{X+ .~) <Z-·{1 t

-
Pt(t)
t [
= l'- a'2e~t
-~ e
Q~ fH
+(A.+ ) ae
o:.t A
-..,e ~t J+-A" Po(t). lfiiW

a.---13 P- a-~ p
,...,. ...... ,..., -(i..2+ ~L)+ V 4)_u,+ JL1
P'J (t} = 1- Po (I)- Pt (t), a=- 2 ,
Ch. 10. Flows of Events. Markov Processes 353

,...,
A- -(1..2+~-L)- v'" 4t..~-t+ 1-12 p(t)=1-p2(t),
p- 2 '

(3) Pi=-~ Po• P2 = { ( ~: )


2
Po, Po= [ 1 + A.
IL
1 ( ~ )
+2 IL
2]- i

- 1 1-p2
(4) 1- p2; (5) tp = 2~t P2 .

10.29. The conditions of Problem 10.27 are changed so that the


nonoperating unit is in a reduced reserve and may still fail with inten-
Rity 'A' < 'A. (1) Construct the directed
graph of states of the device and write
out the Chapman-Kolmogor ov equations
for the probabilities of states. (2) With-
out solving the equations, find the lim-
iting probabilities of states; calculate Fig. 10.29 •
tl1em for '}. = 1, 'A' = 0.5 and ~t = 2.
(3) Using these numerical data, find the average time t 0 p of the
trouble-free oper::~tion of the device.
Solution. (1) The directed graph of states is shown in Fig. 10.29 and
the Chapman-Kolmogoro v equations are
~~o = -(/.. + lv') Po+ /-lPh ~1 = - (lv+ 1-l) P1 + (!v+ lv') Po+ 2~-tP2
Po+ P1 + P2 = L
(2) The limiting probabilities of the states can be found from the
general formulas (10.0.23) for the birth and death process:
PJ""' f.+t..' p
~ o• P2 =
(I..+!..') f..
2 ~t2 Po•
p = {1
o
+ t.+A.'
f.l
+(A.+A.')A.}-1
2!-tz •
Substituting here 'A = 1, 'A' = 0.5, and ~t = 2, we get

Po= { 1 + \_ + 1.s;
5 1
} -i ~ 0.516, Pi~ 0.387, p2 ~ 0.097.
(3) -t Jl = 1-
4
1
""""""' 2 • 32 •
- Pz """"""'
Pz
1~.30. A computer includes four disk storage units. A team of four
cnmes out a preventive inspection on each disk. The overall flow of
~vents! i.e. moments when the inspections are finished by the ·whole
t.m, Is a Poisson flow with intensity A, (t). When the inspection is
Tshed,_a di_sk is checked and may prove seniceable with probability p
(1 le validation time is small as compared to the time of inspection
~nd can be neglected). If a disk proves nonserviceable, it is a<Yain sub-
Jected to inspection (the time needed for this operation does n;t depend
fn ~vh~tlwr an inspection was carried out before) and so on. At the
Je~znn1ng all the disk units need preYent iYe inspection.
a ~ons~ruct a directed graph of states for the system S (four disk units)
'nc '\l'J!C' the differential equations for the probabilities of states. Find
~3-osis
354 Appl~ed Problemt tn ProbGbfllty Thtorv

\I.u the expectat1on of the nun1her of dtsks 'vh1ch successfully pa~s the
1ns pee t1on by the moment 'l'
Solution (s 0) all four dtsh. units need pre\ ent1 \ e tnspectton, (s1) one
dtsk un1t succcs~fully p1.s'=ed the 1nspectton and three d1sk unrts need
repa1r, (s 2)t''o d1sk un1 ts succe,sfully p1.~sed the Inspection and t~ o
need reprur (s,.) three rl1sk unll~ .succc~~fully pas~ed the JnspectJon
and one needs repatr t (s 4) nll four un1 ts succe<;'=full} passed the 1nspec
tlOD
The f1ct that each matntenanco JS successful "1th probabtl1ty p 1S
equtvalent to a p transforntat1on of tho flo'\ of Inspection completions,.
after '\h1ch the flo,\ ren1a1ns a Pots<;on 1lo\v, but '' 1th 1ntens1t) pA (t}

I So ~ p).(t)
"311&
r S1 ~ .,p).(t}
... • !II' r & Ipl{t) I Ipl{t) [ s I
2 .. SJ .- tt-

rig to 3o

The directed graph of ~tates 1s "hO\\'n 1n F1g 10 30, and1 the Chap
man 1\.olmogoro\ equatton,;_; are
dpr/dt = -p} (t) Po dp 1 1dt = p~ (t) (po - p 1 )t
dp 2 /dt = pi (t) (p1 - p 2 ) dp 3/dt = p~ (t) (p 2 - p 3),
dp 4 /dt = p'A (t) Ps (10 30 1)
The lnitlal cond1t1ons are Pu (0) = 1 and p 1 (0) = ~ P4 (0)''= 0

pJ.{t) J 1 l(J}
i 1 (;f)
JO - Sr
7ipJ.{!)
St
2P .
~J
7ipl
Sot;

rtg to 31

The e"pectat1on of the number of d1sks \vh1ch successfully pa~sed


the 1nspect1on by the ttme 1: 1s
4
'I 'f = ~ 1p 1 ( 't) ( 1 0 30 2)
t~l

At a constant 1ntens1ty A the solutions of equations (10 30 1) are


Po (t) = e-~pt, Pt (t) ::::>: 'A.pte-,.Pt, p'l {t) = (A.p;J• e-,.pt,

(~pt~3 ~
P3 (t) = 31
e-lpt1 p, (t) = 1- LJ Pi ( t).
t~o

J0.31. On the hypothesis of the precedtng problem eaeh member of


the team 15 asstgned one dtsk Unit '' hlch he must maintain The no"
of the lnspectton completions per team member has 1ntens1ty A (t)/4
Ans\\cr the questions 1n the preceding problem.
Ch . 10. Flo ws of Ev en ts. Ma rko v Pr oce sse s 355

Th e di re cte d gr ap h of sta tes is sh ow n in Fi g. 10 .31 an d th e


Solution.
Chapman-Kolmogorov eq ua tio ns are
dp /dt = -p 'A (t) p 0 , dp 1 /dt = p'A (t) (p 0 - (3/4) p 1 ),
0

dp 2 /dt = p'A (t) ((3/4) P1 - (1/2) P2),


dp 3 /dt = p'A (t) ((112) p 2 - (1 /4) p 3),
dp 4/dt = (1/4 ) p'A (t) Ps•
0.2 ) fo r M , re ma in s th e sa me fo r th is sit ua tio n.
Expression (1 0.3
is su bj ec ted to an ele me nt ar y flo w of fa ul ts wi th
10.32. A de vic e
'A . A fa ul t is no t de tec ted at on ce bu t on ly af ter a ra nd om
intensity
time int erv al wh ich ha s an
exponential di str ib ut io n wi th
parameter v. As so on as a fa ul t •

is detected, the de vic e is in sp ec t- Sz


ed and is eit he r re pa ire d (w ith
probability p) or re jec ted an d (1-p)J'
replaced by a ne w on e. Th e
time needed for th e in sp ec tio n
is exponential wi th pa ra me ter
Y• the tim e ne ed ed fo r re pa ir Fig . 10.32
IS ex po ne nti al wi th pa ra me ter
ft, the tim e ne ed ed fo r re pl ac in g me ter x.
e by a ne w on e is ex po ne nt ial wi th pa ra
tho rejected de vic
the lim iti ng pr ob ab ili tie s of th e sta tes of th e de vic e?
\\:hat are rm all y;
ge tim e fo r wh ich th e de vic e wi ll op er ate no
Fmd (1) the av era
for wh ieh th e de vic e wi ll op er ate wi th an un de tec ted fa ilu re
(2) the tim e e de vic e
e av era ge co st of re pa iri ng or re pl ac in g th
(produce rej ec t); (3) th vic e
th e av era ge co st of re pa ir is r an d th at of a ne w de
per un it tim e if
15 c; (4) the av era ge los s pe r un it tim e wh en th e de vic e op er ate s wi th
ref us al if th e op er ati on of su ch a de vic e en tai ls a los s l pe r
an undetected
unit tim e.
Solution. Th e sta tes of th e de vic e are ~ (s 0 ) th e de vic e is so un d an d
e de vic e is fa ul ty bu t a fa ilu re is no t de tec ted ,
operates no rm all y; (s1) th vic e
an d th e de
!h~ device pr od uc es rej ec ts; (s 2 ) th e re fu sa l is de tec ted
re pl ac ed . Th e
(s th e de vic e is
~s.mspected; 3 (s ) th e de vic e is re pa ire d; 4 )

lrected gr ap h of sta tes is sh ow n in Fi g. 10 .32 .


eb rai c lin ea r qu at ion s fo r th e fin al pr ob ab ili tie s of sta tes ar e
The alg
'Apn = !!Ps + xp4, f•Po = vp l, "P 1 = "?Pz, PYPz = ~tp3,
(1 - p) yp 2 = xp,,.
Th
th e no rm ali zin g con d1 .t1 'on 1's p o+ p +
1
p+2
p 3
...L p '• -- 1 • Sol\rl· ng
,
ese eq ua tio ns , we ge t
'
'
'
Po =[ 1 + ~+ "' + pl . _.L (1 -p ) i.
y !- 1' r.
J-1 ,
I. pl. (1 - p) 1.
Pt = V
p O• P3 = u Po• Pt .= x Po·

35S Applt~d Probl~ms fn Probabllity Theory

(1) The a' era go time of norma.l operation of the dev1ce 1s Po (2) p 1
(3) For an a\ erage fractton of ttme p 3 the dev1ce ts being rcpatred
every time the ropatr lasts for 1/}1 on the a\ erage, the flo'v of repaired
devices,. com1ng out of tho state s 3 has 1ntens1ty JlPa the average cost
a! repairs per un1 t t1me lS rJ~p 3 SJmllarJJ !be average cost of nelv de
'1ccs per untt t1mc ts cxp 4 the total a' erage cost of both 1s TJ1p 3 + cxp,
(4) Tho a' erage los~cs 1. f1ul t: de\ ICC entatls per un1t t1me are l"Pt
110 33 ' re
no'\ con~tder ho'\ records are stored 1n n data bac:::e of a com
puler Tho tntcnsi t} of the records to he Included 1n the bases IS A(t) and
does not depend on ho'\ man~ records 11 'l.\ e been stored Every record 1s
stored 1n the data bace for an Infin1tel} long period of ttme Ftnd the
chnra ct er1s tics m~ ( t) nn d V ar ~ (t) of t h~ r1. ndom funct 1on X (t),. "h1ch

l I So
........__.......
r
J. t) [
ilo
l
s ~· r,)· ~·I s2 IJ. rt1
)lr • • •
.J rt )l
, - sk J.4 rl)
Jl •••
F1g fO 33
IS the number of records .stored 1n tho data base prov1ded that the 1n
comrng records a.rr1 ve In a Poisson process '\ Jth Intensity A, (t). and at
t = 0 the random function X (0) ~ 0
Solution In th1~ problem \\ e deal 'v1th a process of 'pure btrth
'vlthout any re~trlcltons he1ng tmpo~ed on the number of states (n -+ oo}
All the birth rateos A~:t = iw (t) (F1g 10 33) and the death rates ~L~t =
Jlk (t) ~ 0
For thts case dtfferential equat1ons (10 0 24) and (10 0 25) a<=sume
the form
dmx (t)
dt ~ ~ A{tJpk(t)-=A(t)t
h-0

d v~~:r w - ~ [A ( t) + 21.1. (t)- 2m.E (t) I. (t)l Ph (t} =A (t),


h=O
Cl.tnce
CIO OQ

LJ Pk (t) = 1 ~ kpk (t) = m.l. (t)


h-0 h-0
Solving these equations for the Inltial cond1ttons m::r (0) = Varx (0) =
0 ""e obtain
t

m"' (t) = Var"' (t) = ) i.. (1:) dt


0

Tlus result ~hould ha\ e been e~pected s1nce 1t can be found directly
from the theory of flo\\S \Ve can prove that for any moment t the ran
dom variable X (t) has u Po1sQ.on distribution '' Ith the characteristiCS
mx (t) = Varx (t) obl ,_•ned
'

! Ch. 10. Flows of Events. JJiarkov Processes 357

10.34. The conditions are the same as in the preceding problem except
that a record is stored in the data base for some time after which it is
deleted according to some criterion. The flow of deletions for each
record is a Poisson process with intensity ~L (t).
Solution. In this problem we have a birth and death chain for the
number of records in the data base. The death rate 1-tk (t) = kj.t (t)
since k records are stored in the data base in the state sk and each record
is acted upon by a flow of deletions with intensity ~L (t).
The differential equations for the functions mx (t) and Varx (t) have
the form
00

d~;(t) =~ (A.(t)--k~-t(t))pdt)=A.(t)-~-t(t)m.'C(t), (10.34.1)


k=O
00

dV~;(t) = ~ [A. (t) + k~L (t) + 2kA. (t)- 2k 2 ~t (t) -2mx (t) A. (t)
k=O
t2mx (t) k~t (t)] p 11 (t) =A. (t} + ~t (t) mx (t)- 2j.t (t) Varx (t), (10.34.2)

smce
00

2J
k=O
kPk (t) = mx (t),
00

2.j k 2 PR. (t) = cx 2x


k=O
(t) = Varx (t) + mi (t).
For the initial conditions mx (0) = m 0 and Varx (0) = Var 0 ; for
constant intensities for the addition and deletion of the records, i.e.
i. (t) = 7., and J.t (t) = J.t, the solutions of these equations have the form
mx(t)=m 0 e-~tt+ A. (1-e-~tt)= A. +e-~tt ( m0 - A.) ,
. ~ ~ ~


\ ·arx(t)=A. [ 1
+e -2).tt
-
2
e
-~It J+(A.+J.t Varo + flmo) e
-j.tt
-e
-2).tt
~ ~ ~

+Var 0 (2e-2~tt- e-JJ.t) = mx (t) + (Var0 - m 0 ) e- 2J.tt.


i. For the initial conditions m 0 =
Var 0 = 0, we get mx (t) = Varx (t)=
ii(1-e-~tt). We can prove that for these initial conditions the process
of storing information X (t) has a Poisson distribution for any moment t
and for any kind of function A. (t) (the intensity of the addition of a rec-
ord), hut for this to take place the deletion intensity fl for the records
must he constant.
For constant A. and ~l and t ~ ex>, stationary conditions of information
stordage ·will be established, which naturally do not depend on the initial
con itions: mx = Varx = A./~t.
Tht0.35, We consider the process of manufacturing a type of computer.
r· e manufacturing intensity of the computer '). (t) is shown in
Jg.10.35a. It increases linearly during the first year from 0 to 1000 com-
358 Appl ed Probltms in Probability Theory

puters a ) car, for three l ears the productton sta~ s at 1000 computers
a year" and then t be com put~r IS tah.en out of prod uctto n The average
ser' ICC l1fo of a computer 1S fi.\ e years Consrdertng all the flov.-s cf
e\ ents to be elemeat~ry, find the mean l alue and '8t1a.nce of the num
ber of computer~ 1 n "'et\ rce 1. t any moment t

J.(t)
1000
I
t
f

I I J It t
(a)

••• ~ . . . . sk ~-
p11 {t) ....___, Jll! I (f)
(h)

m 350(]
- - , 1 ' 1 - - - -......._ _ _ _ _
JSOO
JOOO
2500
I
2{10(}
1500 I
1000 920
-iS~
500

f 5 S 7 8 9 f0 t years
{C)

Solutton The manufacturing 1ntensity for the computer


0 for t<O,
kt for O~t< 1,
iw(t) =
). for 1~t < 4,
0 for t~4t
\\here k = 1000 1/Jcar2 :A = 1000 1/year
Let us find the solutions of equat1ons (10 34 f) and (10 34 2} for the
ttme Interval 0 ~ t < 1 under the condttion that ~t = 0 2 1/year for
any ttme Intervals t > 0 Under these condtttons cquatton (10 34 1)
has the form
dmd"/t) = kt- !J.m:l' {t)
Ch. 10. Flows of Events. Markov Processes 359

Solving this equation for the initial condition mx (0) = 0, we obtain

mx(t)= : 2 (e-~tt-1+f.Lt)=25000 (e-t/5-1+{).

In a year's time an average of mx (1) = 25 000 (e- 1 / 5 - 1 + 0.2) =


468 computers will be in service. Note that if the service life of the
computers were infinite, then up to 500 computers would be in service
by the end of a year.
Under the same conditions equation (10.34.2) has the form
rlVarx (t)/dt = 1000t + 0.2mx (t) - 0.4Varx (t).
Solving this equation for the initial condition Varx (0) = 0, we get
Var, (t) = mx (t) = 25 000 (e-1 15 - 1 + t/5). In a year the variance
of the number of computers in service will be Varx = 468, <Jx = 21.6.
The number of computers in service in a year's time will be approxi-
mately normally distributed with characteristics mx = 468, <Jx = 21.6.
On the time interval 1 ~ t < 4 the corresponding equations will
have the form

dm~/t) =A.- f.Llnx (t}, d va;; (t) =A.+ !lmx(t)- 2!1 Varx (t).

We must solve them for the initial conditions mx. (1) = Varx (1) = 468.
We found a solution of these equations in Problem 10.34, whence we
haYe
mx(t)=mx(1)e-~t<t-1)+ ~ (1-e-~t(t-1)) (1~t<4),
Varx (t) = mx (t).
Let us find the value of mx. (t) for t = 4:

mx(4)=mx(1)e-311 + ~ (1-e-311)=2513.
f Thus the average number of computers in service by the end of the
~urth year is 2513. Pay attention to the fact that an average of
3vOO computers were manufactured by that time. Consequently, an
aYerage of 987 computers were taken out of service during the four years.
A process of "pure death" whose graph is shown in Fig. 10.35b will
take place on the time interval t > 4.
In a general case the differential equations for the mean value and
Yariance of the pure death process have the form
00

dm;/t) = - ~ !lk (t) PI! (t),


k=O
360 Appl td Problems Irs Probability Theorfl

For our ca~e llh (t) = k~t and consequently~ we get an equat1on

dmd/t) = - ~ kJiPA (t) = - f.tm% (t)


A==O
wh1ch \\e must sol\e for the lDlltal cond1t1on mx (4) = 2513 The
solut1on of th1s equatton has the form
m~ (t) = m'f (4) e-P-\f 4) (t > 4}
Stnco Varx (4) - mx (4) and ~t = const ''e have Varx (t) = mx (t)
on the ttme Inter,al t > 4
F1gure 10 35c sho" s a relationship bet" een mx (t) 1 e the average
number of computers 1n ~erv1cc and t1me t The dash l1ne sho,\S a graph
of the a\erage number of computers manufactured by a certatn ttme
t "esus t1me t
10 36 \Ve no\\ cons1 der the pro cec:s of stor1 ng terms 1n a dtrcctory
tn a data bank The proce sIS to 1nclude the terms 1n the dtrectory ,v] en

I sl7 r~o t}.. l~ l !! .....


J All ~ J";t /itA .. ~.. ~·{. .S.~n
. ..._,tn q-s"-1
Fag 10 36

they first appear For example the names of organtznttons for \\'htcb
a factory has productJon orders are entered Into the data bank of the
management 1nformatton S}Stem The entr1es for the names of the
organisations Will be accumulated 1n the data banl. of the mandgement
Information system 1n the order In "h1ch they nppear 1n the Input
records
There are an a' erage of x terms In the dtrectory 1n every record fed
1nto the data banh. and the 1ntcns1 ty of feed1ng the .records Into the
data banl IS A, (t) Consequently the 1ntens1ty of the flo'v of terms fed
~

Into the data bank IS A (t} = XA (t) \Ve nssume that tbts IS a Potsson
flo\v The number of terms n IS fintte and nonrandom although 1t may
not be kno\vn beforehand All the terms 1n the dtrectory may appear
In a record 'v1th equal prob'lblltty 'vh1le naturally onl} those terms
are fed 1nto the dtrectory that d1d not yet appear 1n the records F1nd
the mean value and var1ance of the number of terms 1n the dtrectory
Solution \Ve designate the number of terms In the dtrectory as X (t)
The random process X (t) lS ev1dently a pure b1rth process w..tth ~ Jin1te
number of states n 'vhose graph of states 1s sho,vn 1n Fig 10 36 To
find the 1ntens1ty AJt (t) (k ~ 0 1 n - 1) we assume that the
-pt.ol'!e~~ \S, 1.n. the ~ta.IJ,\ -tR B."1 dQ.n-n~..t.)..,Q.TJ. tn~ 1\IQhab.1..t~...t 'J of th.ts. ass u m"Q
tton lS Pk (t) Provided that the ac:sumpt1on IS sat1sf1ed the 1ntens1ty
of the flow of new (not yet 1n the d1rectory) terms 1s

Adt) ="- (t) n:-k =A(t) ( 1- ! )


n

ts . M a rk o v P ro ce ss es 361
Ch. '10. F lo w s o f E v e

d (1 0 .0 .2 5 ) a s s u m e th e fo rm
n ti a l e q u a ti o n s (1 0 .0 .2 4 ) an
The d if fe re
n .3 6 .' 1)
d m x( t) ="" Ll
'A(t) (1- kn )
P k ( t) = 'A ( t) - m x (t )A (t )' (1 0
n
dt
lt= O

d Va~; (t ) = ~
n
['A (t ) ( 1 - ! )+ 2k'A (t) ( 1 - ~ )
lt = O

- 2 m x (t ) 'A (t ) (1- ~ )JpJ.(t)


2'}.. (t ) V a :x (t \. (1 0 .3 6 .2 )
= 'A ( t) - 'A (t) mxn(t) _

s fo r th e s im p le c a se w hen
Let us solve th e s e e q u a ti o n = V a rx (0 ) = 0 .
n = c o n s t, m x (0 )
'A (t) = 'A = c o n s t,
A
--t
li m mx (t ) = n, (1 0 .3 6 .3 )
mx (t) = n ( 1 - e a ),
t- o o
A A
A
--t - - t --t
= m x ( t) e n ,
Varx(t)=n(1-e n )e n (1 0 .3 6 .4 )
li m V a rx (t ) = 0 .
f...,.oo n
mon o to n ic a ll y , te n d in g to
n c ti o n m x (t) in c re a s e s
. Note th a t th e fu n V a rx (t ) is z e ro fo r t = 0 a n d t - + oo
e re a s th e fu n c ti o
m the li m it , w h rt a in v a lu e o f tm w h ic h c a n b e fo und
it s m a x im u m fo r a c e
and a tt a in s t 0 (t > 0 ).
from the c o n d it io n d V a r x (t )/ d =
Hence
--t
0 .5 = e n m " ) tm-;::::;0.1n "' ·
;~

c e m a x V a rx (t ) -;:::::;
u m v a ri a n
For this v a lu e of
1l ( 1 - e-0.7) e-0 .7 =
tm
n 0
th
.2 5
e
,
m
O 'x
a x
(t
im
m ) = 0 .5 vn: m )
a n
=
d th
uvn:
e m a x im u m

a ri a ti o n O 'x (t m )h n x (t
value of th e c o e ff ic ie n t o f v th e fl o w o f te rm s to th e d ir e c to ry in th e
te n si ty ') .. , o f n ,
If we know th e in a n k a n d th e to ta l n u m b e r o f te rm s
g a t th e d a ta b me
records a rr iv in s u ff ic ie n t a c c u ra c y , th e a v e ra g e ti
d e te rm in e , w it h a i.
then we c a n
c to ry , i. e . 1 - e -n -t rm =
ll 9 5 p e r c e n t o f th e d ir e
trlll_ necessary to fi
O.9;), whence tf in -;:::::; 3ni'A. ra c ti c e ), w e c a n
a fr e q u e n t o c c u rr e n c e in p
If n is u n k n o w n (w h ic h is rm in e th e a c tu a l
n ti ty n a s fo ll o w s. W e d e te
a te n o f th e q u a . . , m
find the e s ti m rm s in th e d ir e c to ry m 1 , m 2, . 1
ber s o f th e a c c u m u la te d te m e th e se
, • • • , -r 1 (-ci <
n u m -c i+ ). 'V e a s s u
at ea.cl~ m o m e n t -c , -c , -c 3
1
u a n ti ti e s o f a c c u -
1 2
te ly e q u a l to th e av e ra g e q
quantities to b e a p p ro x im a lv in g
(i = 1 , 2 , . . . , l) . S o
_ e -r i/ n )
~~lated te rm s: n (1
m i = -A
. . . , n fo r th e c o rr e s p o nd-
n d l v a lu e s n , nn, 1
. <>us e q u• a ti o n fo r n , w e fi 1 ~
'V e e s ti m a te
-c ), (m 2 , -c 2 ), • • • , (m 1, -c 1).
Ill
-" pans o f v a lu e s : (m 1 , 1
11 om th e fo rm u la
fr
362 Applr.ed Problem1 ln Probabtlltu Thtory

10 3 7• r or the cond 1tlODS Of the proced 1ng problem find the ttme f.rnt
needed to fill the dtrectory by 95 per cent and the probabtllty that after
t" o years of accumulat1on the dtrectory '' lll contain less than 90 per
cent of all the posstble terms 1f the total number of terms n = 100 000
a tot 11 of 100 000 records are fed 1nt o the data bank per year and each
document conta1ns an average of 1 5 terms
Solutaon \V' e find the 1n t ens1 t y of flo'\ of tetms per record f~d Into
the data bank per annum
A = 100 000 x 1 5 = 150 000 1/}ear
\\ e can find the quant1 ty ttJn from the expression tttu = 3 n!i =
1 100 000/110 000 = 2 1 ears To find the probabtl1ty that after
t" o years of a.ccu mula tt ng terms the dtrect ory '' tll con ta 1n no less
than 90 per cent of 1ll terms, 1t 1s first of all necessar} to find mx- (2)
and Var:r (2) [ ~ee formulas (10 36 3) and (1 0 o6 4)1 ml: (2) =
100 000 (1 ~ e i 5Y2) = 0 95 < 100 000 = 95 000 Varx (2) =
D1 000 X 0 05 == 4710 crr (2) = 68 V
\Jote that the max1mum var1ance Varx (tm) = 0 25n = 25 000,
0 (lm) === 158
;t"

At the moment t ::;;::;: 2 J e1.rs the number of terms tn the directory IS a


random v1r1able A (2) \\ htch 1S appro>..Jmately normally dtstr1buted
"1 th the characterlSttcs obta1nod above Therefore, P {X (2} >
0 Un} ~ 1 Since lnx - 3ax > 0 9n
tO 38 \Ve constder 1 1nore general case for the operation of a data
bank dtrectory The fi.rst comphcat1on as compared to the hypothesis
of Problem 10 36 Is that the max11num number of terms n tn the dtrec
tory IS not constant but 1. funct1on of t1me t n.. (t) (tn the case of a dl
rectory of the names of org1.nn:at1on..:; th1s means that the total number
of organ1sat1ons var1es '\ lth t•me 1ncreastng or decreas1ng)
In ndd1t1on, at ~orne potnt In t1me a term fed tnto the dtrectory 1s
deleted from lt because the term becomes obsolete It 1s assumed that
tl1e flo'v of deletlon'5 1s ., Po1sson process \Vlth 1ntens1t} ~L (t), wh1ch
1~ the same for all the terms 1n the dtrectory
In that case the 1ntens1ttes of the b1rtb and death flo\vs have the form

iwh ( t) = i ( t) ( 1 - k
n (t}
) Jlk ( t) = ~ ( t) h. 1 ( 10 38~ 1}
and equat1ons (10 0 24) and!{10 0 25) assume the form (the relat1ons
of the function m~ (t), Varx.,(t) n (t),!l~ {t), 1.1 (t), Pk (t) and t1me t are
o n11tted for the sake of brevt ty)
dm:.t/dt = A - mx f'A/n + ~t),
dVaro:ldt = 1.-- m:t (Ain ~ f.L) - 2 (Ain + ~) Var.x• (10 38 2)
li the qua.nt1\1es A., n, f! are constant {do not depend on t1me), then,
as t ). oo, stattonary cond1t1ons are poss1ble for \Vhic.h dm~/dt ~
dVarxldt = 0, whence ~
-
m;a:=n(t +JL:·r 1
, Var"=m"(t !-- ~~ r 1

CHAPTER 11

Queueing Theory

rv e cu st o m er s (d em an ds ) ar ri v in g
qu eu ei ng sy st em is on e d es ig ned to se h o n e ex ch an g e,
11.0. A of q u eu ei n g sy st em s ar e a te le p
at random m o m en ts . E x am p le s p u te r sy st em . Q u eu ei n g
ai rd re ss er 's , an in te ra ct iv e co m
a garage, a bo ok in g- of fi ce , a h in q u eu ei n g sy st em s.
o ch as ti c processe s ta k in g p la ce e
theo ry de al s w it h th e st ll ed a se rv er (c ha nn el ). T h er
h ic h se rv es cu st o m er s (d em an ds) is ca n g sy st em s.
A facility w i- se rv er (m u lt i- ch an n el ) q u eu ei
r (o ne -c ha nn el ) an d m u lt -s er v er sy s-
are single-serve a t th e w in d o w is an ex am p le o f a si n g le
A booking-office w it h o n e m an in d o w s is an ex am p le of a
fi ce w it h se v er al m en a t th e w ••
tem, while a bo ok in g- of
multi-server system. ng estion syst em s (s ys te m s w it h re fu sa ls ) an d delay
We al so d if fe re n ti at e b et w ee n co w h en al l th e se rv er s
ar ri v in g a t a co n g es ti o n sy st em
queueing sy st em s. A cu st o m er p a rt in an y fu rt h er p ro ce ed -
e an d d ep ar ts w it h o u t ta k in g
~re busy is re fu se d se rv ic
v in g w h en al l th e se rv er s ar e b u sy
el ay q u eu ei n g sy st em , a cu st o m er ar ri rv er to be come
mgs. In a d u t jo in s th e q u eu e an d w ai ts fo r th e se
does not le av e th e sy st em b be li m it ed o r u n li m it ed . If
ce s m in th e q u eu e m ay ei th er
free. The nu m b er of p la st em . A q u eu e ca n be b o u n d ed
ay sy st em tu rn s in to a co n g es ti o n sy e qu eu e) an d
m = 0, a d el cu st o m er s in it (t h e si ze o r le n g th of th
~oth in terms of th e n u m b er of e k n o w n as "s y st em s w it h
ti m e (s y st em s of th is k in d ar
m ter~s of th e q u eu ei n g
Impatient customers"). ed in te rm s of th e qu eu e di sc ip li ne , i. e. cu st o -
Delay sy st em s ar e al so su b d iv id n d o m o rd er , o r so m e
th e o rd er of ar ri v al o r in a ra
mers may be se rv ed ei th er in p ri o ri ty se rv ic e) . A p ri o ri ty
o b ta in se rv ic e before o th er s (a
cust?mer s m ay be ab le to p ro n to s.
d at io n s, o r ra n k s o f
sernce may h av e se v er al g ra of a q u eu ei n g sy st em ca n be p er fo rm ed in a si m p le st
The an al y ti c in v es ti g at io n tr an sf er ri n g it fr om st at e to st at e ar e el em en ta ry
way ~vhen al l th e fl ow s of ev en ts s b et w ee n th e ev en ts in
m ea n s th a t th e ti m e in te rv al
(stahonary P o is so n 's ). T h is et er eq u al to th e in te n si ty
en ti al d is tr ib u ti o n w it h a p ar am th
the flow have an ex p o n is as su m p ti o n m ea n s th a t b o
po nd in g fl ow . F o r a q u eu ei n g sy st em th e us e th e te rm
of the corres es s are st at io n ar y P o is so n 's . 'V
oc es s an d th e se rv ic e pr oc er b y on e con-
the arrival pr in g cu st o m er s ar e serv ed on e af te r an o th
s;n·zce proces s w he n th e ar ri v n 's o n ly if th e se rv ic e ti me
h is pr oc es s is st at io n ar y P o is so
tfnuously bu sy se rv er . T an ex p o n en ti al d is tr ib u ti o n .
0 r is a ra n d o m v ar ia b le w h ic h h as se rv ic e ti m e:
a customer T se
is an in v er se of th e av er ag e
The P~ameter 2. f th is d is tr ib u ti o n
1-L
in g "t h e se rv ic e pr oc es s is st a-
= M [T se rl · In st ea d of sa y fo ll ow s we
~~ = lf ts er , w he re rs er
y "t h e se rv ic e ti m e is ex p o n en ti al ". In w h at
t;onary Poiss on 's " w e ca n sa
h ic h al l th e pr oc es se s ar e st at io n -
r b re v it y , ev er y q u eu ei n g sy st em in w l m ai n ly d ea l
s !all call, fo sy st em . In th is ch ap te r w e sh al
n' s an el em en ta ry q u eu ei n g
ary1Poisso
" 1 I elementary q u eu
1 ei n g sy st em s. , th en th e pr oc es s in a q u eu ei ng
ts ar e st at io n ar y P o is so n 's
If all the flo w s of ev en
oc es s ~v it .b dis~rete st at es an ~l co n ti n u ?u s ti m e.
as ti c pr ess
~;htem is a_ Mar ko v st o ch
e fu lf il le d . a h m 1 t st a tw n a ry st at e ex ts ts fo r th ts pr oc
. en certam co n d it io n s ar
r th e o th er ch ar ac te ri st ic s of th e
th er th e p ro b ab il it ie s of th e st at es n o
IU which n ei
process depend on ti m e. d ea ls w it h in cl u d e fi nd in g th e p ro b ab il it ie s
~ proble ei n g th eo ry th e
f Th m s th e q u eu
g sy st em an d es ta b li sh in g a re la ti o n sh ip b et w ee n
0
Yanous st R te s of a q u eu ei n e ar ri v al pr oc es s 1.., th e dis-
er s n, th e in te n si ty of th
P~ram.eters (t he n u m b er o f se rv
on ) an d th~ ch ar ac te ri st ic s of th e ef fi ci en cy of
~hlbt~ho~ of th e se rv ic
F
e
o r
ti m
ex
e,
am
an
p le
d
.
so
w e ca n co ns 1d er: e
e -e rn ce sy st em . q u eu ei n g sy st em p er u n it ti m
er s A se rv ed b y a
or l~~ averag e n u m b er o f cu st
th e
o m
sy st em fo r se rv ic e;
l e absolute capa ci ty of
361} l ppll cd P ra ble ms 1n Proba. bllUy Th tory

the probab1hty of serv1ng an arr1v1ng customer Q or the rtlative capadty o!


the .s:,. stern for serY tee Q = AlA
tho probab1hty of a refusal Prl!f 1 e the probab1hty that an arrtving custome-r
w1ll not be served Pr~J = 1 - Q
the average number or customers present 1n a system (betng .ser;;ed or 1n the
-
queue) :
the average number of customers 1n the queue r -
the average v.n.iting llme of a customer (tho average t1me a customer JS present
1n the- s~stem) t" -
tho average queue1ng time of a customer (the avera.ge ttme a customer spends
1n the queue) tq _
the a\erage number of bu:sy ser' ers k
In a general case all these character1sttcs depend on ttme But many serv1ce
Slstems operate under the snme eondillons !or a sufficJently long bme and there
fore a st t ua tton c] o.se to ~ta tJ onn ry b established \Vt thout speetfy1ng 1t every tl.Dle
\\ e sha 11 every'' here ea IcuI ate l1 m 1t1 ng pro bah 1ltttes of the states and the h ml t1ng
charactertstlcs of the efficJeocy of a qucue1ng system w1th respect to 1ts lun1t1Dg
steady state operation
For any open •} queue1n g system opera t 1ng under I1m 1t tn g s t a t1 o nary cond 1hons
-
th.e n 1o erage ~\o.aJ tJng tJme Dl a customer ~~ Js expressed In terms oJ t.he .avPrage num.
her of customers tn the system by Ldtle s formula
(11 0 t)

"'here A Js the 1nten~nty of the arrtval proees!l


A stmJlar formula (also kno'Wll as L1ttle s formula) relates the average queue1cg
time of a cu~tomer tct and the average number r of customers 1n the queue
tq r ).. (ii 0 2)

L1ttle s forn1ulas are 'ery useful s1nce they mal.e 1t pos~nble to calculate one
of the t" o character1.~ttcs of the efficH~nc} (the- a' erage queueing tlme for a customer
and the average number of customers) wlthout nece~Q'arlly calculahng the other
It should be emphasized that formulas (1 1 0 1) and (ii 0 2) are vahd for any
Dptn queueing systtm (s1ngle 5erver multi server and for any kind of arr1val and
serv1ce process) the only requ1rement he1ng that the arr1val and serv1ce processes
mu~t be stah.onary
Another very Important e'tpress10n for an open queue1ng system IS a formula
v.h1ch expresses the average number of busy :servers/... ln terms of the absolute
capac1ty of the S) Qtem for service A
-
J. = Alfl (1 t 0 3)

\v here J..t = 1/i;er 1s the 1n tens 1t y of the Berv1 ce process


'Ianl problems Ill queuetng theory conccrn1ng elementary queue1ng systeiil.S
can be so]ved using a h1rth and death cha1n (see Chapter 10) If the drrected graph
of states of a queueing system can be represented as tn F1g 1 t 0 t then the hmttlng
probah1htles for tho states can be e"tpressed h) formulas (tO 0 23) 1 e

Po {t+ i..o + lo~t + + h~At h~ t + 1


AoA1 ~n 1 }-
1

flt ~1 r.Li J.ldt1 llli Jlt Jl2. J.l )1

A queue1ng system ts sa1d to he open lf the 1ntens1ty of the arnval process


*)
does not depend on the state of the system 1tse1f.,
Ch. 11. The Queueing .Theory 365

(11.0.4)
When deriving a formula for the average number of customers (in a queue or
in a system), we often use the technique for differentiating a series. Thus if
z< 1, then
00 00 00

d d d
~
X X
~ kxh=x ~ dx
xh=x
dx
xh=x
dx 1 - x - (1-x)2 '
R.=1 h=i 1<=1
and finally
().)

X
k xh - --.,...---.,. ( 11.0.5)
-(1-x) 2 '
h=i

Below we shall present without proof a number of formulas for the limiting
probabilities of states and the characteristics of the efficiency for certain often en-
countered queueing systems. Other examples of queueing systems will be con-
sidered in the form of problems.

i!o A.t ..iz Ax-t .il.x


I ~ So '"
fit
QJ: ~ I zl
/lz
S < '" oe
Jls
0
:

Ilk
> I S I~ ~
k
Pk•t
o o o "" .... ,
fln
Sn

Fig. 11.0.1
1. An elementary congestion system (Erlang's problem). Consider an n-server
system with refusals at which customers arrive in a stationary Poisson process with
inftehnsity A.; the service time is exponential with parameter f.L = 1ltser· The states
0 t e system are numbered in accordance with the number of customers in the
system (since there is no queue it coincides with the number of busy servers). Thus
we have
so-the system is idle;
st-one server is busy and the other servers are idle; ..• ;
sn-k servers are busy and the other servers are idle (1 ~ k ~ n); ••• ;
sn-a~l n servers are busy.
The hmiting probabilities of states are expressed by Erlang's formula:

p= { 1 + f, + ~~ + . . . + ~~ }- 1 ;
ph (11.0.6)
Ph= kl Po (k=1,2, ... , n),
wl:me P== 1./!J.. ·
The characteristics of the efficiency are
A=A. (1- PnJ, Q= 1- Pn, Pref = Pn, k= P (1- Pnl· (11.0.7)
en for large v~lues of nit is convenient to calculate the probabilities of states giv-
n (11.0.6) m terms of the tabulated functions:
am
P (m, a)= e-a (for a Poisson distribution) (11.0.8)
m!
and
m


R (m, a)= h ak
kl'·
e-a (11.0.9)
h=O
26G A pp l1ed Pro ble m1 tn Pro bah fltty Theory

(~ee App~udtces 1 and 2) the first of whtch can be expre-~sed In terms of the second
P (m, a) :::;:;: R ( m a} -- R ( m - 1 a) (11 0 10)
Us1ng these functlonCil. v.e can rewr1te Erla.ng s formulas (11 0 6) in the form
Pit. = P (k p)/R (n p} (k ::::: o. i . n) {it 0 11)
6 ,

2 An elemenla.r) s1ngle- srr\ er S)stem ,.ith an unbounded queue Cons1der


a stngle aervPr system at "hu::h customers arr1 ve 1n a sta honary PotSson process
"1th lntenslt} A The ~~rv1te ttme 1s etponent1ul "1th parameter J.1. = tliser The
length or the queue IS unhmtted The lJmlbng probahthtu~~s eXISt only for p
)..If! < t (for p ~ 1 the que-ue 1ncreaces tndefinitely) The states of the SJ.f!tem
are numbered ln accordance "lth the number of customers who are 1n the queue
or be1ng served Thu~
s0 - the s}stem IS 1dle
s1 ~t1Je server 1s busy ,and tb~:re lS JJO queue
s 2 -tbe serve-r 1s busy and one customer 1s '' a1t1ng to be served
!tt-tbe server ts busy and k - i customers are 1n the queue
The hm1ting probilhlhtu~s of states ar& expressed by the formulas
p = 1- p Ph - pk (1 - p) (k ~ t 2 )• (11 0 12)
"here p = AI~ < 1
The characterlstlCS of the effic1ency of the system are
A ~ A. Q= t Prtf = 0 (t 1 0 13)
- p
~,. = )(f-p}
{it 0 14)

and the average numb~r of busy ser' er• (or the probabl11ty that the ~erver IS bus}) IS

I= ~IJ.L =o (il 0 i5)


3 An elrmentar~ ~1ngle ser\ttr s)stem "1th a Lounded queue Cons1der a s1ngle
c:;erver queue1ng S) stem at \\htch customers arr1ve 1n a 5tahonar) Po1sson proce~s
Wl th J n tens1 t} A the ~ erv 1ce tIme 1s expo ncn t 1a I w1th par a meter J-l = t 1i;er There
are n places tn the queue If a cu~tomer an 1 ,~s "hen all the plac~s are occup1ed
he 1s retused ~ervlC(l and departs The ~tatc5 ol the system are
s.,.-the system 1s 1dle
s1-the server IS busy and there lS no queue
.r2-tbe server Js busy and one cucotompr 1S tn the queue
sk-thfl' .:;er\f'r 1s bus} and A - j cuQtomer!i' are 'n the queue
sm+1 ---the ,;::erver IS bus) nnd m customers are 1n the queue
The hm1tlng probabJht cs of states e:x1st for 1ny p = AI~ and are
1--p
Po= _ pm•a
1
Ph~ ph Po (k = 1 ", m 1}. + (11 0 15)

The charact(\rlshcs of the efficiency of tb.e system are


A = A (1 - Pm+t) i - Pm•t
Q = Prer - Pm+t
The average numhet of busy ~@t\ ets (the p:robab1hty that the server IS busy) IS

f: = 1 - Po (ff 0 f7
The average number of customers 1n the queue
r= p2 [t-pm (rn+1-mp)] (1 f 0 18)
(1- pm+f) (1- P} •
Ch. 11. The Queueing Theory 367

he average number of customers in the system


-z=r+k.
- - (11.0.19)

By Little's formula
(11.0.20)

4. An elementary multi-server system with an unbounded queue. Consider an


n-server queueing system at which customers arrive in a stationary Poisson process
with intensity },; the service time of a customer is exponential with parameter
fl = 1/tser· Limiting probabilities exist only for p/n = x < 1, where p = 1./f.!.
The states of the system are numbered in accordance with the number of customers
in the system, hence
s0 -the system is idle;
s1-one server is busy; ... ;
sh-k servers are busy (1 ~ k ~ n); .•. ; there is no queue
sn-ail n servers are busy;
snn-all n servers are busy and one customer is in the queue;
sn+r-all tne n servers are busy and r customers are in the queue.
The limiting probabilities of states are given by the formulas
p pn pn+l 1 1- 1
Po= { 1 + 11 + ••· + + nl n·nl 1-x J '
pk
Pk= kl Po (1 ~ k ~ n); (r~ 1). (11.0.21)

Using the functions P (m, a) and R (m, a), we can reduce (11.0.21) to the form
p (k, p)
Pk =-----:....:.....!..!.----:- (k=O, ... , n);
R (n, p)+P (n, p) (
1
:_x) (11.0.22)
Pn+r=xrpn (r=1, 2, ... ).

The characteristics of the efficiency of the system are


-r=pn+lp /[n·nl (1-x)
0
2 ]=XPn(1-x) 2 ; (11.0.23)
-z=r+k=r+PI
- - - (11.0.24)
Tw=z!'A; ~=rf/,. (11.0.25)

{hAn elementary multi-server system with a bounded queue. The conditions


~f t e n~mbering of the states are the same as in item 4 except that the number m
an~lnces m the queue is limited. Limiting probabilities of states exist for any A.
ft and are expressed by the formulas
p pn pn+l 1-xm }-1.
Po= { i+ 11 + · · ·+ + nl n·nl 1-x '
pk
Pk= kl Po(1~k~n);

pn+r
Pn+r= r 1 Po (i~r~m), (11.0.26}
n ·n.
\\here Y.=pfn='J..J (nft)
.388 4ppl1ed ProblemJ ln ProbabllUy TbeDTV

The charactcr1sttcs of the cUie1cncy of the system ate


-
A= i. (1- Prt+m), Q = 1- Pn+m Pret = Pn+m~ k=p (1-- Pn+m} l11 0 27)
- pn+lp f -tm+i) xm+mxm+J
r== ll t (11 0 28)
n n! {1- x~t
-z::-r+k
- - {Jj D291
iq ===;. n~ :.,. == %tA (i i 0 30)
6 \. multi serv~r congrsUon s}sfenl for a slahonary ro1sson arrival and an
arbitrary ser't Icc time Erlang s formula (1 1 0 6} rema1ns v1hd 1n the ease \\h~n
-
the arr1 val process J5 sta ttonary PoJsson s all d the serv LCQ- lL me T ser has an arb1
trn.ry d1str1bu hon \V[th expecta t1on t$er = 1 'll-
7 A s1ngle server system '' 1lh an unbounded queue (or a stationary Po1sson
arr1val nnd an nrb1trary servu~e time If thf3 arrLvals at a stngle server queue1ng
S}.stem eome lD u stntLonat) Potc.s }0 no,\ \\ lth lnleU.9lty A and the !(lfVlCe tJme fs.e-r
has an e'tponen tlal d1str1 bu bon \\ 'th {l::tpccta t1on 1/~t and the coeffictent of var1a
h n u.,a. then the average n urn her of customers 1n the queu~ 1s e'tpr~czsed by the
Pollac~ek Kh ~nthine forn"tu l a
-r = .,~ {t + Lp)'Zf(z u - Pllt tt t o 31}
\\here p = )Jll and the average numhpr of cu-=tomers 1n the S\ stem
-z = {p~ {1 + t~.J~/t2 (1 - p)]} +p (11 0 32)

U!nog L1ttle s forn1ula \\ e get from (11 J 31) and (11 0 32)
- p'l (! -- t u 2 P' ~ 1 --LJJ') f
t - f-'0 t ==- J.L +~ (11 0 33)
q ~ 21.. t 1 ----. r) 2A. rt - p p. •
8 A single server queuein~ s}~tem
ror an arb1trary (P'almt arrival procesJ
and an arbitrary service tlmeo Th(lre are D() exact ra-rmulns for th1:s case, nnd an
approxJmatlon for the length of the qupue 1s gJven by the formula
;- ~ P' (vi + vJ)/l2 (I - p)] (11 0 3~)
,,. here- va th.o coe I.fi cJ en. t oi varJa t.z on v1 the 1n t erarr1 va1 t 1m e ~ p ~ )J1J. A J ~ the
J.s
1nverse of the Pxpectati.OQ of that tnne 11 = 1/i;er IS the 1ove-rse of the average
~er v 1ce t Jme ~.oiJ. 1s the co e f fi. c1en t of v a tl a h on of t be s erv 1ce tl ma The a.\ era ge n um
ber of customers 1n the queU~?lng system
... ~ {p2 (vi + v~)/[2 (1 - p)l} + p, (1 t 0 35)

1.nd the average queuelng and \Valhng t1mes of a customer are

and
-
tq ~ p2 (t-l + v~)/[21.r (1 - p)}, (1 t. 0 35)

-t"- ~ {p 2 {ul + ~J)II2A. (1 -- p)]} + i/}l (t1 0 37)


respettl~ ely
9 An clclnenta.ry mult1 phase delay s')stem It 1s d1focult to annlysc multt
phase queue 1n q s ~ $ t ems 1n the general case s 1n ce tho a rr 1v a1 prot ess of eacb .succes
litl v e p base lS the de par! ure proc. ess oI the preeed l ng p ha '-e a. n d there are ~ Itereif rc ts
1n the general c1::,e Ho¥. ever if cusfome~s a.rru;e i~ a sfa tta no. ry Po'S$On p~ess ol
a queue lng tystem u tth an u nbo u.nded queue and the serrnce. ttme 's exp on.en.Ha l thtn
I he thporJu re proc.ess Is stat ~onary Pousan. s lll th the Sliill~ lnten..,.:ty A. as I b.e arr1' Jl
It foUo,vs that a multt phase qucue1ng S) ~tem "1th un unbounded queut! b(l(oro each
phase a ~tahonary Poisson arrtval and an e~ponent1al serv1ce tnno at each phase
can be ana1y~Pd M .a s.1Jll pl e .s ~que nc P of e] e1nen tal"} qu eu e1 ng .s }'"!terru
Ch. 11. The Queueing .Theory 369

H the queue before a phase is bounded, then the departure from that phase is
olonger stationary Poisson's and the above technique can only be used for ap-
pro:dmations.
In what follows we shall use the notations for the efficiency characteristics as
given on pp. 363 and 364, hut we shall introduce new ones when needed. When we
define the density f (x) on the x-axis, we shall not indicate the values of f (x) on
the boundaries of those parts.
If the unit of time is not fixed, we shall designate, for brevity, the intensities
of the processes simply by letters, e.g. 1.., fl, ..• {without indicating the dimen-
sions). The same refers to the time ~v and tq. Now if the unit of time is fixed (a min-
ute, an hour, a year, etc.), we shall indicate the units of measurement.
Rather than using a distribution function F (t) it will he more convenient in this
chapter to write the distribution of a mixed random variable T as a "generalized"
density f (t) which is defined as
I (t)=F~ (t)+ 2J. Pl) (t-ti),
t

where F~ (t) is the derivative of the distribution function on the intervals of its
c~ntin~ity, Pi=~ {T = ti}, and t'l (x) is the delta-function whose properties are
given m Appendix 6.

Problems and Exercises


1.1.1. Consider a single-server congestion system at which customers
a:nve in a stationary Poisson process with intensity A, the service
~~~~ being exponential with parameter !l· At the
lmhal moment t = 0 the server is idle. Construct
the marked graph of states for the system. Write
and solve the Kolmogorov differential equations
for the probabilities of states of the system. Find
the limiting probabilities of the states and (for Fig. 11.i
steady-state conditions) the characteristics of the
efficiency of the queueing system, i.e. A, Q, P rer• k.
Solution. The states of the queueing system are: s 0 , it is idle; s1 ,
the server is busy. The graph of states is shown in Fig. 11.1. The Kol-
mogorov equations are
dp 0 1dt = -APo +
!-LP1• (11.1.1)
'l Since Po+ PI= 1 for any t, we can express p 1 in terms of p 0 , i.e. p 1 =
-Po, and get one equation for p 0 :
dp 0 /dt = -('A + !l) Po + ~t. (11.1.2)
Soh·ing this equation, >Ye get p 0 as a function of t:

hence
Po(t)= f..t!l [1 +
Pt (t) = 1- Po {t) =
When t --+- oo, we get limiting probabilities, Yiz.
Po = ~t/(1, + !1), P1 = /J(/. + ~t). (11.1.3)
370 Applied Problem1 !n ProbalJflity Theory

\\ e can find them much more east] y by c:.ol v 1ng aIgebr a 1c Iinear equa
t lOns for the 11m tlJng probah1l 1t 1es of state~
A.po Po + P1 = 1
= llP1
'Vc can re't\ rt te formulas (11 1 3) tn a more conctse form 1f we lntro-
duce a deslgn 1tlOD r = Alfl
Pe = 1/(1 + p) P1 == p/(1 + p)
The char"'lctertsttcs of the efficiency of the queue1ng system a:re

A=?l.pa= t~p Q= i~p Pnr=Pt= t+.p '


- p (f f 1 4)
k=1-po = t+p
j 1 2 G1' en , t e) epl one 11n n "h1 ch 1s a ~1ng Ie server congest 10n sys
tom calls arr1' e at t c; tnput ln a stnttonary Poisson proce s The trafiie
1ntens1ty 1s "'- - 0 4 calls per m1nute The a' erage duration of a con
versat1on t5 er = 1 mtn and has an exponential distribution F1nd the
l1mlt1ng prob1btl1t1es of the states of the system p 0 and p 1 as '\\ell as
A Q P ret and k Compare the capac1ty of the system for servu~e and
Its rated capac.1ty If each con' crsnlton lasts exactly three minutes and
the calls arrt ve regularly one after another Without a break
Solution l ~ 0 4 ll - 1fls~r = 1/3 p = All-l = 1 2 By formulas
(111. 3) Pfl =::: 1/2 2 ~0 455 p 1 ~0 545 Q ~0 455 A= '-.Q ~
0 182 k = p 1 ~ 0 54.5
Thus on the a" erf)ge tl o telephone ltne wtll serve 0 455 calls 1 e
0 182 calls per mInute The r 1ted ca pac.I t y of the server would be (1f
the calls arr1ved and '\ere ser\ ed regularly) A rate = iftaer = 1(3 ~
0 333 callslmtn and that 1s almost t'' 1ce as large as the actual ca
pactty A
11 3 \\ e aro given a single serv-er congest1on system \Vtth customers
aJ"rlvJng Jn a qtatJonary Po1sson process w1th 1ntens1ty A. Tbe .se.rvJce

t~tr t,~ tser


..t .
D
~

®
it
=~ •=•
'-
'toT
r
®= ~
-'t2 • ~. -v @~

. . "ts
A


lj_

, t

Fig t 1.3
t1me ts nonrandom and lS exactly tser = 111-1 F1nd the relattve and
absolute capac1ties of the S) stem for service 1n the l1m1t1ng stationary
state
Solutton Let us constder a stattonary Potsson arr1val procecos \VJth
tntens1ty k on the t ax1s (F1g 11 3) \Ve shall denote all the customers
\vho are served by ctrcles Assume that a customer "ho arrtved
at the moment t 1 1s he1ng served Then all the custome_rs
who arrtve after htrn during the t1me t ser wtll not be served The next
Ch. 11. The Queueing Theory 371

to be served be the CUStomer who arrives at ar moment t 2 such that


'\Vill
t2 - t1 > tser· Let us consider the interval T between the end of the
first customer service period and the moment t 2 the next customer to
be served arrives. Since there are no aftereffects in an elementary flow,
the distribution of the interval T is the same as that of the interarrival
time in the general case, i.e. exponential ·with parameter 'A. The average
length of the interval T is m t = 1/"A.
Thus nonrandom busy periods of the server (tser = 1/f.t long) and
random idle periods (average length 1/lv) will alternate on the t-axis
A fraction of all the customers
1/!-1 f..
1/fl+ 1/f.. - f..+ f1 '

will fall on the first intervals and a fraction


f.t/(lv + f.t) = 11(1 + p), where p = lv/f.t
will fall on the second intervals. The latter quantity is the relative
capacity of the queueing system for service
Q = 1/(1 + p), (H.3.1)
whence
A = lvQ = !vi( 1 + p). (1'1.3.2)
Note that formulas (11.3.1) and (11.3.2) coincide with (11.1.4), which
co~re.sponds to the exponential distribution of the service time. And
th1s IS natural since Erlang's formulas remain valid for any distribution
of the service time with the mean value equal to 1ht.
11.4. Using formula (11.0.5), prove that for an elementary single-
server system with an unbounded queue the average number of cus-
z
tomers being served iR = p/(1 - p), where p = lv(f-1-, and the average
number of customers in the queue is = p2 /(1 - p). r
(k ~lution. By formulas (11.0.12) p 0 = 1 - p, Pk = pk (1 - p)
- 1' 2, ... ).
We designate the actual (random) number of customers in the sys-
tem as Z:
00 00 00

Z=M{Z]= 2J kpk= 2J kpk(1-p)=(1-p) 2J kpk.


k=O k=1 k=!

By formula (11.0.5) for p < 1


00

"" kpk-
LJ p
- (1-p)2 .
k=1

~h_:nce Z= p/(1 - p). The average number of customers in the queue


IS" • -
. "~Inns the average number of busy servers k = Alf-L = 'J../~t = p,
=
I.e. r [p/(1 _ p)] _ p = p2/(1 _ p).
2~·
372 AppJJed Problems In ProbabJIJty Theory

1f 5 A rat I\\ ay ~o.hunt1ng J ard 1s


a sLng1e-ser' er queuetng system
\Vlth an unbounded queue nt "htch tratns ntrt' e 1n 'l stationary Po1sson
process The traffic 1ntens1ty JS .A ~ 2 tra1ns per hour The .serv1ce t1me
(shunting) of a tt-a1n at the lard has an c'tponent1al distribution v, 1th
mean value t;er
= 20 mtn ftnd tho lim1t1ng probahtllties of states of
the S) stem, the average number .z of tra 1ns at the yard, the average
-
number r of trruns In the q11eue the a, crage "a1t1ng t1me tw of a tratn -
1nd the average queuetng ttme fq of a tratn
Solution ).. = 2 traJn.s/h ~er === 1/3 h ~L = 3 tratns/h, p = 'JJp. =
2/3 D} formulas (11 0 12) "e have Po = 1 - 2/3 1/3 p 1 ~ =
(2/3) (t /3) = 2'9 p 2 = (2/3)~ (1/3) = 4127, , Pk = (2/3)~ (1/3) etc
z
By formulas (11 0 13) and (11 0 14) = p/(1 - p) = 2 tralnSf = r
4/3 tratns Tw = 1 h and fq == 2/3 h
11 6 The h} pothe~ts of tho preccdtng problem ts compllcated by the
condttton that nQ more than three tratns can be present simultaneously
In the ~hunting ~ ,rd (Includtng the tra1n being served) If a train
arr1ves at a moment 'vh~n there f{fe three trains 'l.t the yard, tt has to
JOln the queue out~ 1de t 1u} 5 1rd on a s1d e tcacl. The sta. tton has to pay
a fine of a roubles p~r hour 1f a tra1n st 1.ys on the stde track F1nd the
a' erage ftne per day the stat1on has to p1) for tratns staying on the
stde track
Solution '' e calculate the a' eragc number of tra1ns Zs!de on the
stde track
co
- (.'0 00

= 2.J 1.. p,_ - ~ kp" Po = Po }] ;{p11 ,


Zstde = i P~t. + 2ps + h~'- ll=-l h=i
'OD co OG

~~~
~ kp = p ~
11

h~i
:p ,l = p ~
A~l
d p"~p d
dp -
~p~t.
dp LJ
11=6

B, LJttlo s formu1a the average t1me n trazn stays on the su}e tr.ack
-
tsJd@ ~ 1 18/~ = 1 18/2 = 0 59 h On the n'erage 24 A = 48 tra1ns
a.rr1 ve at the st a tJ on C\ ery 24 hours The a' erage datly fine JS 48 X 0 59 a~
28 3a
11 7 Us1ng the btrth and death q,cheme, calculate dtrectly from the
directed graph of st ,_ te.s the l1m1 t1ng pro ba b 1l1 t1es of states for a SliD pie
two ser' er queueing system (n = 2) wtth three places 1n the queue
(m = 3) for k ;;:: 0 6, 11 = 0 2 and p = 'AfJ.t = 3 Ftnd the character
lStlCS of thiS queUeing system l e Z, r
tw, tq Wlthout USlDg formulas
(11 0 26) proceedlng rather from the ltmtttng probabtlrttes, and com
pare the results 'vtth those obtatned from (11 0 26)
Ch. 11. Th e Q zw ue in g Th eo ry 373

st at es of th e qu eu ei ng sy st em is
Solution. The di re ct ed gr ap h of io n A/1-1 = p, w e ge t th e
e de si gn at
shown in Fig. 11.7. In tr od uc in g th
~
~ Ss
..;. .
Sz SJ S~t
So St
2p. 2p 2p

Fi g. 11.7

us in g a b ir th an d de at h ch ai n:
following, -

Po= { 1 +p + p2
2
+ + 23 + 2
p3
22
p~ p5 } -1
-I =
1
(40.58)- ~ 0.02::>,

:0 ~5 8 ~ 0.165,
5

~ 4 8
6: ~ ~
40 58
~ 0.074, p2 = 0.111, p3 = •
Pt =
15.18
10 .1 5
~ 0.250, p5 = _ ~ 0.375,
p4 = . 40 58
40 58
X 0 .1 6 5 + 4 X 0.250 + 5 X 0.375~3.67,
;= 1 X 0 .0 7 4 + 2 X 0 .1 1 1 + 3
0.375 ~ 1. 79 , tw = z/ 0 .6 ~ 6. 11 ,
r= 1 X 0 .1 6 5 + 2 X 0 .2 5 0 + 3 X
I
I

) tq = rJo.G ~ 2.98.
I v al id fo r an y x < 1 or x > 1.
11.8. The fo rm ul a fo r- r (H .0 .2 8) is an un de te rm in ed va lu e 0/ 0) .
For x = 1 it is no lo ng er va li d, yi el di ng

.l ,1, .r---._;.;1.
So
~
s, Sz eoe :1 t:eo•:n;ti
kJL
Sk Sn ~oee
np
Fi g. 11.8
ch ai n de ri ve th e pr ob ab il it ie s of th e
Directly from th e b ir th an d de at h fi nd th e ef fi ci en cy ch ar ac -
is ca se an d
states p 0 , p 1 , • • • , Pn +m for th
teristics of th e qu eu ei ng sy st em , i. e. A , Q , Pr er • k, tq
sy
,
st
tw
em
.
ha s
r, z,
st at es of th e qu eu ei ng
l th Solution. The di re ct ed gr ap h of
th e ge ne ra l fo rm ul as fo r th e b ir th
e form sh o\ m in F ig . 1'1.8. U si ng = p, w e ob ta in
and death scheme an d de si gn at in g !J1-1
pn 011+1 pn+m } -1
Po= { t + 1! + 2! + · · · + nl + · +
P P2 n· nl • • • -L
' -'-n=m-.n-:-1
,I
{t r p r n r
= + 11 + 2! + •.. + n! ~ n + nr
2 (P)z, I .. •
(r
+ n)m]}-1 •
'

For x = p in = 1
pn mpn } - 1
( 11.8.1)

.'
P o= { 1 +
p
1! + 2! p2
+ .. . T n! + nl '

-- -- - - ~--
874 Appllet! Prcbl~mJ ln. Prt:Jbabllitv Theorp L

(11 8 3)
z:=:r+k, Tw=zi'A, tq=r!Jv.
t 1 9. A petrol station has two petrol pumps (n = 2), but 1ts forecourt
can only hold four \Va1t1ng cnrs (m = 4) The arr1val of cars at the stat1on
1s statJon.nry Poisso n's WJth 1ntens1 ty :h.= 1 earlmi n The service t1me
of a car 1s expone nttal "1th mean values tser = 2 m1n F1nd the ltmlt·
1ng probab tlltlcs of the states of tho petrol stat1on and 1ts characteristics
- ........... - ~

A , Q ! re rt k z, r, t vn t q
1

Solut1o n A, = 1 11. = 1/2 = 0 5, p = 2, x = pin = 1 From for-


mulas (11 8 1) (11 8 3) \\C find that
1 2' 21 1
+ }-1
Po = { + 2 + 21 2l 4 = 13 '
2
Pt = P2 ~ P3 = p., = Ps = Pa = 13 '
Pret = 2/13t Q = 1- Ptef = 11/i3, A .= lQ = 11/13
~ 0 85 cars/m1n,
J: = A/1• = 22/13 ::::1- j 69 pumps,
- 23 4 (4+ i} 1 - - -
r= ~154 cars, .z=r+A.~323cars
21 2 13
t 1 10 Custom ers arrl\ e server conges tton s}stem at a rate
'lt a two
of ;., = 4 custom ers per hour The a\erag e service t1me tser = 0 8 h
per custom er The Income fron1 every custom er served c = 4 roubles
The upkeep of every server 1s 2 rouble s/h Dectde whethe r It Is advan
tageou s to 1ncrease the number of servers to three
Solut1on By Erlang 's formul as (11 0 6)
Pa={1+32+ 3
2 rt 1
=<9 32t•~o 101, p 2 ~-!!;-~osso,
Q = 1 - p2 ~ 0 450, A = 4Q ~ 1 8 custom ers/h
The 1ncome from the custom ers 1n the gl\en var1an t lS D =A c ~
7 2 roubles lh
\Ve calcula te the same parame ters for a three server queue1ng system
(ileJJ£11. . .1>-"Qg tbe.m .bJ" /?F}NW!i)
p; = { 1 + 3
3 2+ .}/ + T' }3
3
-t ~ o 0677. p; ~ 5 48 x o 0677::::::0 371,
Q' ~ 1 - P3 :;:::::; 0,629, A I = 4Q' ~ 2 52,
D' =A' c ~ 10 08 rouble s/h
Ch. 11. The Queueing .Theory 375

The increase in the income is D' - D = 2.88 roubles/h, while the


increase in the cost of the upkeep is 2 roubles/h. It can be seen that the
transition from n = 2 to n = 3 is economically advantageous.
11.11. We consider. an elementary que'lleing system with a practi-
cally unlimited number of channels (n-+ oo ). The demands arrive with
intensity 'A while the intensity of the service process (per channel) is 1-L·
Find the limiting probabilities of states of the system and the average
-
number of busy channels k.
Solution. This que.ueing system is neither a congestion nor a delay
system, but it can be regarded as the limiting case for a system with
refusals for n-+ oo. Erlang's formulas (11.0.6) yield
co
"\1 pk } -1 "' pk -
Po= { L.J k! =e-P, where p=!l' Pn= kl e P=P(k, p)
k=O •

(see Appendix 1). For an infinite number of channels A = "A and k =


t.f~t = p.
11.12. We consider a single-channel congestion system to which
demands arrive in a stationary Poisson process with intensity "A. The
service time is exponential with parameter 1-L = 1/~er• The operating
channel can fail from time to time (refuse), the failure process being

(a)

-+-1----{@>--~~,.--{@ 1 )( X @>--4X~XE--~)-E-(.....,@-@~)(~-
0 t t
X -service process p
@ -refusal process v
{b)
Fig. 11.12

sta tionary Poisson's with intensity v. The reconditioning (repair) of


a c1lannel begins immediately after it fails, the time of repair T r being
exponential with parameter I' = 1/I;. The demand which was being
ser~ed at the moment of failure departs from the system unserved.
Fhd the limiting probabilities of the following states of the system:
~b' t e channel is idle, s1 , the channel is busy and in working order, s 2
A e channel is being repaired and the characteristics of the system are
and Q.
. Solu~ion. The directed graph of states of the queueing system is
gi\'en In Fig. 11.12a. The algebraic equations for the limiting proba-
376 Applfed Probltm• in Probabllzty Theory

btltttes of states are


f1P1 + 'YP2t (~t + v) P1 = i..po, ~P1 = 'VP2' (11 12 1}
APo =
they are summed '' 1th the norm 'l}tztng cond1tton
Po Pt + P2 = 1. + (1112 2)
\Ve el'pre~ tho prob'lhJlJtJes p 1 and p~ from (11 12 1) In terms of p11
A ~ AV
Pt = ~~+v Po' P2 = y P1 = 'Y (~"+v) Po
Subst1tut1ng p 1 and p 2 1nto (11 12 2), '' e get
Po = { 1 + )./(~L + \) + Av/[1' ((t v)l}-1 (11 12 3} +
To frnd the rclat1 vo capac1 ty for servtce Q, the probabtltty PD that
the demand 'v1ll be occepted for service must he mult1pl1ed by the cou·
diltonal probab1l1ty p that the demand ,vbtch lS accepted for serv1ce
\VIll actually he served (the channel 'v1ll not fa1l dur1ng the servtce
ttme} \Ve ~hall use the tntegrnJ total probabtltty fo-rmula to find the
condttlonal probab111ty \\ e ad' a nee a hypothcs1s that the serv1ce t1me
of a demand has fallen on the 1nterv1l from t to t ;-- dt, the probabd1ty
of th1s hypothesis he1ng approxiniately j (t) dt. 'vhere f (t) Is the dts
tr1button denstty of the servtce time,. 1 e J (t) ~ ~Lew-J.l.t (t > 0) The
cond1 t1onal proba btlt t y that the ch nnnel v. 1ll not fa 1l dur1ng the ttme t
1s e 'Y t hence
[.'1(]1 0::.

p = ) !1P ~le-VI dt = ) J!e-(P+'Y)I dt = ~~'V ,

0 0
Thts condlt1onai probahlllty can be found much more eas1Iy 1t 15
equal to the probabtl Lt y that once 1t hns begun, the .serv1ce procedure
'' 1ll be completed before the channel fa1ls. 'Ve super1mpose on theta 'tiS
(F.Jg 11 12b) the two processes, v1z the service process w1tb Jnten~Jty 11
(denoted by crosses) and the refusal procc~s \VIth 1ntens1ty v (denoted
by circles) \\ e f1x a potnt t on the t a~ 1S and find the probab1lt ty that
the first cross follo\v1ng the fr'\ed potnt w1ll arrtve earlter than a ctrcle
It ts ev1dently equal to the rat1o of the 1ntens1ty of the of croc;:ses no"
to the total 1ntens1ty of the floVts of crosses and Circles, ~L/(Jl v) +
Thus
Q--PoP ,. ---- ( J-1+
,... }
v
I (1 + p.+A + v
h'V
1' UL+v)
) ..-
-
p.
ll+v+A (i+"'IY) f

A=AQ. (11124)
J 1.1341 The cond.1t1ons of Problem 1 f 12 are repeated w.1th tbe only
ddlerence that the channel may fatl dur1ng an 1dle period too (\vith
tntens1ty -v' < v)
Solution The d1rected graph of states of the queue1ng system IS
shown In F1g 11 13 From the equations
(h + \'') Po = f!Pt + YP1' (!-1- +rv) P l = 'Ap ru ~P2 = "VPt + v' Po•
Pa + P1 + Pt = 1
Ch. 11. Th e Q ue ue in g Th eo ry 37 7

we find the li m it in g p ro b ab il it ie s
={i + tJl+'),. v + !vv+Jl!l+Vv'+vv' } -1 '
Po
'). lvv + JlV' + vv '
P t= Jl + v .; P o • P z= ~t+v Po•
A.Jl
Q = p o Jl ~ v , A = 'A Q = Po Jl + v .
si n g le -c h an n el q u eu ei n g sy st em
11.14. We co ns id er an el em en ta ry qu eu e, m = 2. T h e op er at in g-
li m it ed n u m b er of pl ac es in th e
with a

- SIO Szo S3o


Sao foE
ft fl ft •
J)
J) }' j) '}'
p
J_ ,. •
Stt Szt
}'
Fi g. 11.14
Fig. 11.13

d em an d be in g se rv ed at th e m o m en t
channel m ay fa il (refuse). T h e ar e u n o cc u p ie d pl ac es , an d if
er e
of failure joins th e w ai ti n g li n e if th . T he in te n si ty of th e ar ri v al
are no pl ac es it d ep ar ts u n se rv ed
there ~~, th a t of th e fa il u re pr oc es s-
es s is
process is A., th at of th e se rv ic e pr oc io n in g (r ep ai r) pr oc es s is y.
co n d it
Ef the channel is v, an d th a t of th eemre, find th ei r li m it in g p ro b ab il it ie s-
•n um er ate th e st at es of th e sy st
and determine A ' k ' ' ' tW ' tq forz
Sol ution. The st at es of th e sy st em are:
r 'A = 2 lL
' r-
= 1 '
'V = 0 •
5 an d "'
r
= 1•

e ch an n el is in w o rk in g or de r;
8
oo -t he sy st em is id le , th in g o rd er , th er e is no q u eu e;
5 o -t h e ch an ne l is b u sy an d in w o rk
1 n g re p ai re d ; on e d em an d is -
b ei
5
u -t h e ch an ne l h as fa il ed an d is
Waiting to be se rv ed ; r; on e d em an d is be in g-
in g or de
s2o-the ch an ne l is b u sy an d in w o rkbe se rv ed ;
I

I se rv ed an d an o th er on e is w
. s21-the ch an ne l h as fa il ed an d b is
ai ti nO ' to
b ei n g re p ai re d ; tw o d em an d s ar e-
In the queue; an d s ar e in
o rk in g or de r; tw o d em
th Sso-the ch an ne l is b u sy an d in w se rv ed .
e qu eu e an d on e d em an d is b ei n g
is sh o w n in F ig . 11 .1 4. T he -
Th~ graph of th e st at es of thbeabsyilstitem ie s of st at es ar e
equallons for th e li m it in g p ro

~lP1o = l•Poo, ~P2o + Y Pu +


l"P oo = (~ + 'A v) +
P1o•

~lPao + Y P 21 A +P1 o = (~ L 'A + + v) P2 o•

AP2o = (~ L + ·v) Ps o• V P2 o = (~ l + '\' ) Pao•

"P1o = (1' + 'A) P w 'A pn V+P2 o V +Pa o = Y P2 1•


'
' Poo P1 +0 P +u +
P2 o +P2 1 Pa +o = '1.
878 Applied Problems tn Prohabllltv Thtorv

Sol vt ng the~e equa t tons at ~ = 2 11 - 1 v = 0 5 and y ==== 1 we obta1n


p 00 = 3/61 ~ 0 049 p 10 = G/61 ~ 0 098 P20 ;::;: 14/61 :::::; 0 230
p 30 = 56/183 ~ 0 306 p 11 = 1/Gt ~ 0 016 p 3 t = 55/183 ~ 0 301
Hence
z = 1 + P t) + 2 (p + P + 3p = 383/183 ~ 2 09
(p1o 1 20 21 ) 3o

r = 1 (p + p + 2 {p + p ~ 89/61 ~ 1 46
20 11) 21 30 )

k = 1 (pte + Pto + Pao) - 116/183 ~ 0 63


The abc;tolute capac1t' A of a queue1ng system 'v1th nonrefus1ng
channels can be found by mult1ply1ng kby J.t In our case the producttv
1ty of one channel (the number of demands actually served per unrt
time) can be found by m ul t 1pl) tng k by the pro bab1l1 ty ~-tf (I-t v)
- +
that the servtce procedure'\\ 1ll be completed t e A - kt-t ~t/(~ v) - +
k)l21(J-L + -v) ~ 0 42
1 t 15 There are three cha1rs (n 3) 1n an outstation dental surgery
and three cbatrs tn the ' tttng room (m - 3) Pattents a:rrtve 10 a
statronary Potsson proce~s 1.t a rate of A - 12 patients per hour The
servtce ttmP (for one pat1ent) s exponential "1th a. mean value ta~r = -
20 mtn If all the three chntrs n the 'va1ttng room are occupted an
arr1 ving p'lt 1ent does not JD1n tl1o queue F1nd the average numbPr ol
pattents that the denttsts serve per hour the average fractton of the
arrlvtng pat1ents that are actuall} served the average numb~r of occu
pted cha1rs In the \\a t1ng room the average \Va 1t1ng time tw of a pat1ent
(Including both the dental surgery and 'va1t1ng room) and the a' erage
"'a1t1ng t1me g1 ven that the pat1ent 'vtll be served
Solution p 12/3 ~ 4 x = p/3 4/3 n = 3 m = 3 From for
mulas (11 0 26) (11 0 30) 've find that

P0 ={1+4+ 2
41
6
+ s-+
4
3X6
4
-<"413
t-4/3
11
~
>'}
1"Jo 0121s~o 012
,....,
Pt = 4 X 0 01218 :::::r 0 049 p 2 - 8 X 0 01218 ~ 0 097

Ps = 32 0
a X 01218 ~ 0 130 4'
p 3 +1 =as X 0 01218 ~ 0 17 3

Ps+2- {- P3+1 :::::: 0 231 P3 +s =-} P +z ~ 0 307


3

The averago fract1on of pattents betng served 1s Q = 1 - Rret =


i - Ps+s ~ 1 - 0 307 - 0 693
The average number of patients ser,ed by the dent1sts per hour lS
A AQ ~ 12 X 0 683 ~ 8 32
By formula (11 0 27) the average number of busy servers (denttsts)
-
k = 4 (1 - Pa+a) ~ 2 78
379
Ch. 11. T he Q ue ue in g T he or y

u m b e r o f p a ti e n ts in th e q u e u e
By formula (11.0.28) th e a v e ra g e n
""' "' 1 56
3 4
8 1 -4 X (4 /3 ) + 3 (4/3 )
- = 44X0.0121
r 3x6
2
(1 -4 /3 ) ;"o ../ • '

- - - -=- -
f w
-
= z / A . ~ 0 . 3 6 2 h.
tq r/11.~ 0 . 1 3 h ,
z=r+k~4.34,
a u se so m e p a ti e n ts d o n o t jo in
Th e va lues of 7w a n d tq a re sm a ll be c
a v e ra g e w a it in g ti m e
a rt u n se rv e d . T h e c o n d it io n a lf
the queue an d d e p ~

a t h e is se rv ed, is fw = t w ! Q ~ 0 .5 2 h , a n d th e
of a p at ie n t, p ro v id e d th ~

g ti m e (u nde r th e sa m e c o n d it io n ) tq =
~onditio na l av er ag e q u e u e in
tq!Q ~ 0.19 h. .2 6) a n d (1 1 .0 .2 8 ) y ie ld a n in d e te r-
11 .16. F o r x = 1 fo rm u la s (1 1 .0 and
v a lu e o f th e in d e te rm in a te fo rm
m in ac y of th e fo rm 0 /0 . F in d th e '

s w h ic h a re v a li d fo r x = 1.
'OI'l'ite form u la
Sol ut io n. B y L 'H o sp it a l' s ru le
. 1 -x m -m x m -1
hm = -x
=m,
Y. -+ 1
1-x
f pn pn+lm } - 1 (11.16.1)
P o = l 1 + +···+
p
11 n! + n ·n l '

(11.16.2)
Ph= ~~ Po (1~k~n),
pn (11.16.3)
P n+ i = P n+ 2 = •· · = P n+ m = nl Po•

it h Pn a n d e n d in g w it h P n+ m•
all th e p ro b a b il it ie s, b e g in n in g w
i.e.
are equal.
d k re m a in th e sa m e .
T~e ~ormulas fo r A , Q, P re f a n
in a te fo rm in fo rm u la (1 1 .0 .2 8 ),
dm g th e v a lu e o f th e in d e te rm
Fm
we get
= pn +lrpo m (m + il . (1 1 .1 6 .4)
lim 1 -( m + il x m + m x m + l - m (m + 1 ) 2 n ·n l
2 '
11~1 (1 -x )z
n d (1 1 .0 .3 0 ) re m a in th e sa m e.
Formulas (11.0.29) a 6 .1 )- (11 .1 6 .3 ) w it h o u t fi n d in g th e
. We co u ld d er iv e fo rm u la s (1 1 .1 h e m e.
b u tu si n g th e b ir th a n d d e a th sc
of th e in d e te rm in a te fo rm
'alu e
fi ci en cy ch a ra c te ri st ic s A , Q , P re t• k, T ,
_11.1 7 . (1 ) C a lc u la te th e ef
- - e r sy st e m w it h th re e p la ce s in
"
e le m e n ta ry si n g le -s e rv
~, tq , tw fo r a n
us t o ~ e r s ~e r r = 1/f.L =
6he queu~ (m = 3) g iv e n '). , = a4ra ccte
h o u r a n d 't; ;e
ic s w 1l l c h a n g e w h e n th e n u m ber
ri st
£5. (2) F m d o u t h o w th e se c h m = 4 .
0 q u e u e is in c re a se d to
places in th e B y fo rm u la s (1 1 .0 .1 2 )- (1 1 .0 .1 6 )
f Solutio n . f.L = 2 , p = A./f.L = 2. Q ~
l = ·1 6/31 , Q ~ 0 .4 8 4 , A = 'A
o 1 /3 1 , p
or m = 3 w e h a v e P = 4
~ 0 .9 6 8 , r::::::: 2 .1 9 c u st o m e rs , z ~
1.93 cu st o m er s p er h o u r, k = pQ
3.1G customers, ~ ~ 0.55 h a n d fw :::::::0.79 h .
380 Appll~d Problems fn Probabilltv Th~ljry

(2) For m = 4 '' e have Po = 1 63 ~ 0 0158 p 5 = 32/63 ~ 0 507


Q ::::;: 0 493 ,t ~ 1 96 customers per hour ~ 3 11 customers r ~ z ~
4 09 customers tq ~ 0 78 h and tw ~ 1 02 h ·
Thus an 1ncreac.e In the number m of places from three to four leads ~
to a neg l1gible 1ncre'lso 10 the nbsol u te (and rela t1' e) ca pac1 ty of the '
system for servtce It doe$ tncrea~e some~hat the average number of 1
customers 1n the queue '\nd 1n the S}stem and the correc;pondtng a'erage
t1mes Thts ts natural since some of the customers who \\ ould he refu~ed ~
1n the first var1ant JOin the queue tn the second :a

11 18 Ilo\v 'vtll the cfftctenc)' cbar acterts tics of the system 1n the 1
prccedtng problem change tf A nnd p. rematn the ~arne, m = 3 but the
number of ~ervers 1ncrea.se to n ~ 2?
Solution x =
1 from formulas (11 16 1) and (11 16 2) \Ve have
Po - 1/11 P 1 - Ps = 2/11 Q - 1 - 2/11 ~ 0 818 A ~
3 27 customers/h r-
12/tt ~ 1 09 customers f = Al~t ~ 1 64 i""-
r +h. ~2 73 cu~ton1crs tq ~ 0 27 h nnd tw ~ 0 68 h
11 19 The queueing system 1s a ra1l \vay book1ng of flee "1th one
'vtndo\v (n t) and an unbounded queue The man 1n the 'v1ndo\\ sells
t1ch.ets to potnts 1 nnd B On the a\ erage three passengers e'\ery
20 minutes arrr\ c to bu:-;. '"\ ttckct to po1nt A and t\\ o p1s~engers every
20 mtnutes arr1' e to buy a ttcket to B The arrt\ al process can be con
stdered to bo stationary Potsson s On the a' erage three passengers are
served 1n 10 mtn The "erv1ce t1me IS e""<ponent1al F1nd 'vhether hmtt
1ng probablltttes of stater;; of the S) stem e"{Iq,t o.nd tf they do calculate
the first three Vtz Po p p 1 Ftnd the eff ctcncy characteristiCS of the
- - ....... -
system z r tw and tq
Solut•on /t.A - 3 20 0 1~ cu~tomers/mtn ha ~ 2/20 = 0 10 cus
tomers/mtn The tot '11 'l.rrl\ al 1ntens1ty A- A,.A + A. s = 0 25 custom
ers/m1n ~l 3/tO ~ 0 3 customers/min and p = Al(l ~ 0 833 < 1
Limiltng probabtllttes do exiSt By formulas (11 0 12) (11 0 14} Po ~
0 167 p1 ::::::; 0 139 p2 ;::;::::: 0 116 _ z R: ~ ~~~ ~ 4 99 customers : =
0 833 /0 167 ~ 4 16 customers tw ~ 4 99/0 25 ::::;;;! 20 0 mtn and tq ~
4 16/0 25 ::::::: 16 7 mln
11 20 A single channel queue1ng system rs n computer wh1ch rece1ves
calculation JObs The Jobs arrive 1n a stationary Potsson process the
1nte.rarr1val time betng t = 10 mtn The ~ervtce t1mo Tser has an Erlang
distrJbUtJon oi the thJrd order w1t.b expectatJon ~er --- 8 mJn F1n2
the average number z of JObs 1n the system and the a\era.ge number r
of JObS ln the queue and also the average \\8ltlngttme tw and queUelDg
ttme -tq for a job
Sol ut1on ' ' e can .find tho cbaractertstics of the S} stem from the
Pollaczeh l(hinchlnc formulas (11 0 31) and (11 0 32) 'Ve have ~ ;:=;:
0 1 JObslmtn t-t - 0 125 Johs/mln and p - A./~ = 0 8
The coeffic1en t of var1at1on of the servtce t tme for Erl ang s dtstti
v
butlon of the thtrd order IS 1} 3 By formula (11 0 31) = 0 64 (1 r +
Ch. 11. Th e Qu eu ei ng Th eo ry 381

113)/(2 X 0.2) ~ 2.13. B y fo rm ul a {1 1. 0. 32 ) z r+


= 0. 8 ~ 2. 93. B y
Little's formul a Tq ~ 21 .3 m in an d f..v ~ 29 .3 m in .
of th e pr ec ed in g pr ob le m is ch an ge d: jo bs
11.21. The hy po th es is ry Po is so n flo w . T he
no t in a st at io na
arrive now in a Pa lm flow an d ti on of th e se co nd
ra liz ed E rl an g di st ri bu
in!erarrival tim e ha s a ge ne 1/ 2 an d 1. = 1/ 8.
38 ) w it h pa ra m et er s 1. =
order (see Pr ob le m 8. 1
io ns of th e ef
2
fi ci en cy
37 ), fin d ap pr ox im at
Using formulas (1 1. 0. 34 )- (1 1. 0.
characteristics of th e sy st em . ra liz ed E rl an g di s-
w hi ch ha s a ge ne
Solution. A ra nd om va ri ab le T, om va ri ab le s T
nd or de r, is th e su m of tw o ra nd 1
tribution of th e se co ra m et er s 1. = 1/ 2
ia l di st ri bu ti on s w it h pa
and T2 which ha ve ex po ne nt 1
[T ] =
and A2 = 1/8. H en ce M [T ] = 1/1. 1 +
1/1. 2 = 10 m in , V ar
2 = 68 , v~ = 68/10 2 = 0. 68 an d v~ =
Var [T1] + V ar [T 2 ] = 2 2 8 +
y3. r
Consequently = 1. 62 jo bs , = z 2. 42 jo bs , fq = 16 .2 m in an d ·

lw = 24.2 m in . Th e re fu sa ls oc cu r
fr om ti m e to ti m e.
. 11.22. A device ca n fa il {refuse) 1. 6 re fu sa ls pe r da y.
ma stationary Po is so n pr oc es s w it h in te ns it y I. =

Fi g. 11.22 Fi g. 11.23

ai r) T ha s a un if or m di s-
T~e t~me needed fo r re co nd iti on in g (r ep
Fi nd
rep
(f or li m it in g st ea dy -
tnbutlOn on th e in te rv al fr om 0 to 1 da y.
er ag e fr ac tio n R of ti m e fo r w hi ch th e de vi ce is
state conditions) th e av
operating. g
e de vi ce ar e: s th e de vi ce is op er at in
Solution. Th e st at es of th 0 ,
st at es of th e de vi ce
re ct ed gr ap h of
~nd sl, it is be in g re pa ir ed . Th e di
= 0. 5 = 2. T hi s
18
shown in Fi g. 11 .2 2, w he re fL = 1/ M [T re pl 1/
-
di re ct ed gr ap h of st at es of a si ng le
ghaph exactly co in ci de s w it h th e th e ar ri va l pr oc es s is
W e kn ow th at if
c a~nel congestion sy st em . bi tr ar y di st ri bu ti on ,
ha s an ar
~tat10nary Po is so n' s an d th e se rv ic e ti m e
is ca se p = 1./fL = 0. 8,
en Erlang's fo rm ul as (11.0.6) ar e va lid . In th
6 ~ 0. 44 4.
~o = {1 + p/1 !} - = 1/ 1. 8
1 ~ 0. 55 6 an d p 1 = 1 - 0. 55
or e th an a ha lf
fh~ s R. ~ 0.556, i.e . th e de vi ce w il l op er at e
be
a
un
li tt
de
le
r
m
re pa ir .
0
t le hme an d th e re st of th e ti m e it w il l
th e pr ec ed in g pr ob le m th e de vi ce ha s
11.23. U nd er th e co nd iti on s of er at in g. T he pa ra m et er s
~n exact re pe at er w hi ch ca n fa il on ly w he n op
m 11 .2 2. Fi nd th e va lu e of R an d th e
·and fl. are th e sa m e as in Pr ob le
averabae • num b er k- of th e fa ul ty devi.Ces.
bo th de vi ce s ar e so un d
(o o~uhon. Th e st at es of th e
8 sy st em S ar e: s 0 ,
on e de vi ce is op er at in g an d
thne Is op~rating an d th e ot he r is id le ); s 1 ,
de r re pa ir . Th e di re ct ed
bo th de vi ce s ar e un
gr e other Is un de r re pa ir ; s 2 ,
ex ac tl y co in ci de s w it h
.2 3. T he gr ap h
aph of st at es is sh ow n in Fi g. 11
382 A pplled Problems Jn Proba bllltp Thtorv

the graph of st~tes of a t,,o cbannelsy.stem w1tb refusals From Erlang;


formulas (11 0 6)
Pa={1 +~~}-t={i+OS+OG4/2)-l~0472,
+ fJ
p 1 ~ 0 8 X 0 472 = 0 378, R =Po + p 1 ~ 0 850,
p~ = 0 150 k ~ 1 p 1 + 2 P 2 = 0 728
The sa me t echn 1que (reduct 1on to a Sl stem 'VI th refusals) can ev1
dently be used '"'hen tho numoor of .repeaters JS more than one
t 1 24 1n a "C'ry I argc boolsh op t1 e ~ecur1 t y s ystcm lS c:uch tha 1
havtng clto.::en a book tn 1 dep1"rtment 'l customer tales .rt to the as('tstan;
'' ho \\rttes out an In\ o1ce Thr- cu~tomer must then take tbe 1nvo•c'
to the cash des.k and pay for the ~n' 01cc Having paid he can then returt
to the c:.ell1ng department ,vhere the a~s1stant ehccl s the 1n' o1ce anc
''raps the book so that the fi:ccurtty st.aff kno\v the boo h. IS patd for
The customer must therefore go through three serv1cc phases (1) col
lectton of tnvo1ce (2) pa"l ment (3) book \vrapp1ng Assume that eusto
mcrs arr1' e at a department 1n a Po1sson flo'' 'v1th ;., = 45 customer:
per hour
There "l.re four 'lsststants to 'vr1to out the tnv-oices and the averagE
-
service time t 1 1s ftve m1nutes The average serVICe time at the cashter t:
.-..

{a.~sume for s1m pllcity there 1s one who only serves the one departmen1
and no others) IS one m1nute per customer Three ass1stants cheek tb{
1nvo1ce and \Vtap the books the av~rage c:erv1ce t1met! be1ng t'vo m1n
utes per customer Ac:sume that the flo,vs are stat1onary Poisson s and
tl at the syst~m 1s a three pho.se queue1ng system F1nd 1ts efftctenc~
character ts t 1cs
T;_ (;; ;;)~the average number of customers 1n the queue 1n the
.first (.second and thlrd) c:crvico phase
Z'; (z~ z-;)-the average number of customers 1n the fi.rst (the second1
the th 1rd J ph a ~e
t J (t J tj )-the a'\ erage queue1ng ttme of a customer in the first
(the second the tbtrd) phase
tJ [i~ , i;,u }-the ,a, erago \vaJting t1me of a customer Jn the- first
(the second, the thtrd) ph a ~c
-
r-the overall average number of customers tn all three queues
-z-the total average numbe:t of customers 1n the shop,
-
tq-the total average queueing t1me of n customer
-
tw--the total average W11t1ng time of a customer
A:o..~w.~J:. t..lut t~u. .~'lt'..'2,~ !l.i.i.\t,}Rn.~~\ ?.t_'2re.l\J..~~ \\\ l 'A'.~ ~~~}it~ ~}1.~ and
how should the ~ervtco he 1mproved to reduce the watttng ttme of a
customer? (2) How can the fact that some customers havtng got an
1nvo1ce find they do not have enough money to pay (and so leave the
system before the cash1er queue) be taken 1nto account? The fract1on of
such customers IS ct (0 < a. < 1)
Ch. 11. The Queueing .Theory 383
Solution. Since all the processes are stationary Poisson's, the depar-
ture processes from all the three phases are also stationary Poisson's,
and three successive phases can be regarded as three individual queueing
systems with their own characteristics.
1. First phase. Since there are four assistants, the number of servers
n1 = 4. Furthermore, t 1 = 1/fl1 = 5 min = 1/12 h, p1 = 45/12 =
1514 ~ 3. 75 and x1 = p1 /n1 = 15/16 < 1. From formulas (11.0.21)-
(11.0.25) we find that
p~~, = { 1 +3. 75 + (3.;5)2 + (3;:;'3 + ~. ~~r
+2·3·4·~·(~ ~15/16) }-
5 1
~ (151.58t ~ 0.0066, ~ =
1
3.75.
-
r1 = (3. 75) 5 Pb1' ~ 13.01, z- 1 = 16. 76,
4·41 (1-x1 ) 2
~ 1 = ~//., ~ 0.289 h ~ 17.3 min, tW =~/'A~ 0.372 h ~ 22.3 min.
2. Second phase. ')., = 45, n 2 = 1 and p 2 = 0. 75 < 1. From formulas
(11.0.12)-(11.0.14) we find that
Tz= p;/(1-p2) = 9/4= 2.25, ~ = p2/(1- Pz) = 3,
tlrll=r2/fl=0.05 h=3 min, ~\P= 1/15 h=4 min.
3. Third phase. n 3 'A = 45, p3 = 3/2 and x 3 = 0.5
= 3, < 1. From
formulas (11.0.21)-(11.0.25) we find that
p~3 ' ~ 0.210, ra ~ 0.237 and z; ~ 1. 737,
tlf' ~ 0.316 min and ~> ~ 2.316 min.
Adding together the average numbers of customers in each queuet
~·e obtain the total average number of customers in the queues, i.e.

r=~+rz+Ts~ 15.5.
_B~~ analogy we find the average number of customers in the system
+z z
'== zl 2 + 3 ~ 21. 5.
The average queueing time per customer
tq = t~' +t~> +t~':;:::;; 20.6 min.
The average waiting time per customer
tw=t#'+tW+tW~28.6 min.
{1) The service can be improved by decreasing the waiting time of
~~usto~er in the first phase, which is the weakest phase in the system.
he easiest \\'ay to do this is to increase the number of assistants, i.e.
e number of servers in the first phase. For instance, a simple unit
~c~asc i~ the number of assistants (i.e. a transition from n1 = 4 to
1
- 5) Ywlds an essential gain in time. Indeed, for n 1 = 5 we obtain
384 Applted Pr<JbltrnJ In Probabilttg Thtory

1or the first phase p1 = 3 75, x.1 = 3 75/5 = 0 75, p(~) = {59 71} .d
T, :::::: 2 08, i; ~ 5 13, t~J) ~ f 84 mrn and t.Jl ~ 6 8.4 m1n
(2) \Vc can take 1nto account the fract1on of customers ct leavtng t~
shop " Jthou t a putchaso hJ m ul tJ pl yJng the 1n ten~~ ty ol the arr1"
proces~ of the c;:econd and thtrd phases by (t - a.) 1
1 t 25 Trains arrl ve at a ra1l \vay ~hunting yard 1n an Erlang procr 1
.of order 10 \\ 1t h 1ntens1t} A. :=::=: 1 2 tra1ns per hour•> The servtce tnt~
of a train T,e 1 IS dtstrtbuted o'er an Interval frc 1
0 to 1 b \vtth denstly rp (t), shown 1n F1g 11 :
Us1ng formulas (11 0 34) (t 1 0 37), a ppro"trmf ·
2
the efficJenc} cltaractcrtStlc..~ of the lard, 1 e t ~
- .....
a' erage n nm her : of tra tns at the yard and 1n t ...
- - -!"f'
queue r the nver,go "a1 t1ng t1me lw of a tr1 ·
--..,;...~""""""'".:;...;;,..;.....__ t
-
and the a' eragc queue 1ng t 1m e lq of a tratn "
n ' Solution 1- or the dlstrtbution f.P (t) we h1~
F1g II 25 t,er = t/3 and p - i.-l~t = 0 8 < 1 For a 10
order Lrlang proc~ss vl = (itVk)~ = 0 it vJ <!
!Je found b} dt" 1d1ng the 'ar1ance V flr I Tserl by the squ1re of \
mean 'nlue "~ = 1/2 == 0 5
\Ve ftnd the charactertsttcs of the queue1ng S) stem 1
- r = p2 (v~ + v~)/(2 (1 - p )1 = 0 82 (0 1 + 0 5)/(2 0 2) = 0 96, •
z = r + r : : : 0 96 + 0 8 = 1 76,
Tq = rlh ~ 0 8 h, tw = tq + 1/J.t ~ 113 b
11.26,. Sho'\ that for an elementary n server queuetng system 'v
.an unltm1tcd number of places tn the queue the a\erage numberL "-.I
~ustomers 1n the queue IS
xtl+l n _ _,n+t
1-x Kl{n-t) + 1 <r< 1--x •
Solu hon
{11 0 23)
'V te the
e \\ rl
and ( 11 0 21)]
€.1. press ton for r Jn the fo llo\VJng form f
'
r= p'U-J
n nl
Po
{1-x)' -
- ~
(1-x)2 Pn-.
p"
where Pn ~ nl Po
Consequen ti ~
Pn = ~~ {1 + p + ~~ + +
p"
nl +
pn ~
nl 1-x
) -1

= (1 + ~+
p
n (n-1)
p:~
+ +~n!
p 1.
+ x
t-x
J-t
> ( f + ..!:.._
p
+ n
p"~
1
+ ~ ~
+ _nn
P"'
+ 1-x
X ) -I ~ X.n ( 1 _ X)

•) Note that A. here 1s the 1ntens1tl of Erlang '- tlo'v and not that of the i
~bich "as thnmed out tfJ get the Erlang no"
Ch. 11. The Queueing Theory 385

~ the other hand


_!_J.. n nn-l X )-1 __ xn(1-x)n
+
Pn < ( 1 p ' p2 + ···+ pn + 1-x - +
xn (n -1) 1 ·

1ce [xn (1- x) n]/[xn (n- 1) + 1] < Pn < xn (1 - x), the in-
lily indicated in the problem is satisfied too. Note that the last
1ality can be used to approximate all the characteristics of the
eing system in question .
.27. A railway booking-office has two windows in each of which
'ls are sold to two destinations, Leningrad and Kiev. The arrival
esses of the passengers who want to buy tickets to Leningrad and
~hm the same intensity A- 0 = 0.45 pass/min. The average service
-
~per passenger (selling a ticket to him) tser = 2 min. •
rationalization proposal has been made, i.e. that in order to de-
se the queues (in the interests of the passengers), the two booking-
es should specialize, one to sell tickets only to Leningrad and the
!r only to Kiev. If we consider, as the first approximation, all
arrival processes to be stationary Poisson's verify whether the
posal is reasonable.
~lntion. (1) Let us calculate the characteristics of the queue for
"o-server queueing system (the existing variant). The intensity of
arrival process').,= 2A. 0 = 0.9 pass/min, It= 1/~er = 0.5 pass/min,
~AI~t == 1.8 and x = p/2 = 0.9 < 1, thus limiting probabilities
l\. From formula (11.0.21) we have

1-
1
~0.0525,
2 1
Po={1+1.8+ 1.S
2 1- 09
. J
1from formula (11.0.23)

-_ 1.8 3 ·0.0525 7. 7 .
r- 2. 2 . _ ~7.7 passengers, tq= _ ~8.56 mm.
0 01 09
t~l In the second (suggested) variant we have two single-server queue-
·t systems with p = A, 0 f~t = 0.45/0.5 = 0.9 < 1.
_r;ormula (11.0.13), the average length of the gueue at a window
-p·/(l- p) = 0.92/0.1 = 8.1 pass, and thus the total length of
rue at the two windows 2r = 16.2 pass. - -
··~\q~eueing_time of a passenger, according to (11.0.14), tq ~ ri'A_ =
in th- 18. ~lll, and this is almost twice as long a queuemg t1me
•JJnclns·
e BXIS hng Vanan
• t , v1z.
· 9 . 3 mm.· . . .t
. JOn: the "rationalizati-on" proposal must be reJected smce 1
~ llcaUv d· · ·
!elfi · • lmJmshes the efficiency of the system. The re uc Ion d t ·
°f
-:e1 ~~~ncy of the system caused by going from a two-server sy~tem
\a~ ng v~ri~nt) to two single-server systems (the pro~os~d varrant)
'i1nts eth division of the booking-office into two spec1ahzed offices
11.28* e men at the windows to repeat each othe: · ". . ,
-' "'er. An elementary multi-server queueing system wLth LmpatLent
,.,._ s and an unbounded queue. \Ve have an elementary
'. ' ~
386 Applud Probl~mt tn Pral!alll!Ug Taeorfl_

n ~er\ er queueing system, the 1nten~:aty of the arr1' al process ts A, and


the 1ntens1ty of the s~rvJce process tSJJl,. ;.· 1/~tr The queuetng tlme
of a customer lS J1m1ted by a certain random l1me T 'tv. b1ch has an expo
n en t 1a1 d 1st r1 but ton 'Vl th parameter 'V ( e \ ery customer 1n the queue
rna) depart an(l the departure process bas an 1ntens1ty v)
\Vr1te the formulns for the ltmit1ng probabtlttles of states, find the
rclat1 "e capaCJtj of the system for c:ervi.ce Q. tbe average length of the
rt
queue the 3\eragc qucUelng time fq of a CUStomert the average num
z
her o£ customer~ 1n the system and the average "atttng ttme tw per
customer
Solut•on As bcfoie, we enumerate the states of the ~)stem 1n accor--
dance ~ tth the number of customers Ill tbe system The d1reeted graph
--~ .l l l J l
~~~ rt:-.._: :[sn rj ..=• ..
J
[s~):P-~ ).;1 S2 ~ • ••: ~ sk J;: •~ ·{ Sn .J
Sr 4 ..

2p Jp. kp np np, ~ r1)l ,.~

of states sho,vn 1n Ftg 11 28 Us1ng the general huth and death


JS
formulas and Introducing the designations p = A/~L ancl ~ v/p, =
·ne obtaln
p P'
Po~ { 1 + 11 + 2J +
+
- p
Pt-- 11 Po

' Pn+~
~ p" P'"
- nl (n+ ~)(n+2~) (n+r~) Po (r;:o.t), (H 28 2)
Tb.e first formula (11 28 1) Includes an 1nfin1te sum, whtch 1s not
a geometric progre~~Ion but whose terms decrease faster than the
terms of d geomctr1c progre~ston ''e can prove that the error due to
the t1 uncat1on of the sum to the first r - 1 term~ JS smaller than
~ {PIP )r e PI~
nl rl
\\ e ac.~ume that \ve ha' e calculated the probnbtltttes p 0 , Pt'
,- A-., ' P'n + r' (l.w ,v ~ .ll"\J'\V $lh.r-.V Mv 5.\.s . .f~rul t,1~ ~:luv·..ar
teriStics of the queue1ng system the relative capac1ty for ser\ 1ce Q,
the average number of customers 1n the queue rand others \Ve shaH
f1rst f1nd Q All the customers '' tll be served except for those who will
depart ahead of ttme Let us calculate the average number of customers
'' ho depart un~erverl p~r unit tJmo The 1ntens1ty of the departure
Ch. 11. Th e Q ue ue in g .Theory 387

an d th e to ta l av er ag e in te ns it y
process per cu st om er in th e qu eu e is v vr. H en ce
th e cu st om er s in th e qu eu e is
of the departure pr oc es s pe r al l
solute ca pa ci ty of th e sy st em fo r se rv ic e
the ab
vr
- , (1 1. 28 .2 )
A = ') .. .-

and the re la ti ve ca pa ci ty fo r se rv ic e
... (1 1. 28 .3 )
o = Alf... = 1 - -WJf.
Thus, to find Q w e m us t fi rs t of al l kn ow r, w hi ch w
+
e co ul d fi nd
+ . · ..
di -

rectly from th e fo rm ul a r = 1P n+ t +
2P n+ 2 + .. . rp n+ r
be r
is th at it co nt ai ns an in fi ni te nu m
But the dr aw ba ck of th is fo rm ul a ex pr es si on fo r th e av er ag e
g th e
of terms. T hi s ca n be av oi de d b y us in g (1 1. 28 .2 )
s of A , i. e. k = A lfL or , ta k in
number of bu sy se rv er s k in te rm
into account,
k = (f .. .- vr )lf L = p - ~r . (1 1. 28 .4 )

From (11.28.4) we ge t
(1 1. 28 .5 )
-r = <P- k)J~.
er s k ca n be ca lc ul at ed as th e m ea n
and the average nu m be r of bu sy se rv r of bu sy se rv er s) w it h th e
K (t he nu m be
value of th e ra nd om va ri ab le ec ti ve pr ob ab il it ie s Po r
th e re sp
possible values 0, 1, 2, . . . , n an d
P1, P2, • • ·, Pn- u [1 - (p+ + .. · +
o P t P n- t) h
-k = 1p 2p
+
1 2 + ... + +
(n - 1) Pn-t n [1 - (po +Pt
(1 1. 28 .6 )
+ ••· + Pn-1)].

_Furthermore, we ca n ca lc ul at e b r
y fo rm ul a (1 1. 28 .5 ). T he q u an ti ty

tq can be fo un d b y L it tl e' s fo rm ul
a:
"tq = rJf.... (1 1. 28 .7 )

erage nu m be r of cu st om er s in th e sy st em
The av
- - {11.28.8)
z= r +
k,
~

ag e w ai ti ng ti m e pe r cu st om er
and the aver
(1 1. 28 .9 )
rw
= zff....
ei ng sy st em w it h "i m pa tie nt " cu stomer
Th is is con~
th e qu eu
we Remark. We can pr ov e th at for ab il it ie s al w ay s ex is t w he n P > 0.
finnhave considered, li m it in g pr ob th e first formula (11.28.1) converges for an y
po<~? by the fact th at th e series in e, th at a qu eu e ca nn ot in cr ea se in de fi ni tely:
tb~\ tve P and ~- Th is m ea ns , in essencte ns it y of th e de pa rt ur e process of th e cust(}.
tncrs~nger the queue, th e hi gh er th e in

ng sy st em w it h "i mp a~
fi ~;,29. A n el em en ta ry tw
1 o-serv er qu eu ei
11 .2 8) . T he in te n si ty of th e ar ri va l
en customers (see P ro bl em er taer =
th e av er ag e se rv ic e ti m e of on e cu st om
nocess '). = 3 cu st /h , hi ch a cu st om er w ai ts fol'
fo r w
ll === 1 h an d th e av er ag e ti m e
25•

-- - --
388 Appli~d Problems in Probability Thttory

serv tee "p 1 t u~ n tl' ' 1s 0 5 h Cal cui ate the l1m t t1 ng proba h111 t1es of
statest rest.r1cttng the calculattons by the probab1ltttes wh1ch are no
smaller th 'ln 0 001 r 1nd the character 1st 1cs of the efficte nc} of the
--
........ ...... ...........
system, 1 e Q, ,~l t k, r, tq, tw
Sol ut1on. \\ e b a' e A == 3, f-1 === 1 , v = 2, p = 3, ~ = 2, n = 2
From the formulas \\e der1ved for Problem 11 28 we get
3~ 3:1 [ 3 3' 33 34
=
Po { f + 3 + 2 -L 2 T + 4 6 + 4 6 B + 4 6 8 10
3 5o
..... 4 6 8 10 1l
3e
+ 4 6 8 10 il 14 +
3'
4 6 8 10 12 14 16
J}- ~ 0 0692 •
t

\\hence
3 3 I.
p 1 =3p 0 ~0208 p2 ~TPt~0311, p 3 = 4 p 2 ~023ij,
3 3
P~o=i- p3 ~ 0 117, p~ =-g p" ~ 0 014,. p8 = iO p 5 ~ 0 013,
3 3
P1 = 12 Ps ~ 0 003 Ps = t4 P7 ~ 0 001.

The a' erage number of busy ser' er~ ( 11 28 6) k = 1p1 + 2 (1 -


PIJ - p 1 ) ~ 1 654 the n\erage c;uze o! the qu~ue from (11 28 5) r = -
(p -h)/~ = (3 - 1 65lt)/2 ~ 0 673. the absolute capac1ty for service
A = AJ.-L ~ 1 654 cust/hl the relatt\e capac1t). for ~crv1ce Q = Al'h ~
0 551 and
0 776 b
tq z r
=riA- ~ 0 224 h, ~ + k ~ 2 327 and tw = zi'A ~
11 30. An elementary queue~ng system tvtth errors" Customers arr1' e
at an n server svstem 'v1th an unbounded queue tn a stat1onary Potsson
process 'v1th tntcnstty "A the serv1ce t1me 1s cxponent1al w1th parame
ter Jl Betng "er" e d IS not a gu 1.ran tea of sa t1sfac t1on, and the server may
satts(y the demands of a customer ,vtth probabtl1ty p or not sattsfy
them 'v1th probab1l1ty g == 1 - p In the latter case the customer
returns to the system e1ther being served 1mmed1ately, 1f there 1s no
queue; or JOlntng the queuo Define the states of the system (numbertng
them tn accordance '' 1th the number of customers present 1n the sys
tem)t find tho llmtttng probalnll t1es of states and the effictency charac-
teriStlcS of the sys tern and ftn d the n verage number of reset\ 1ce cla1ms
per untt ttme lf e\ ery unsatisfied customer may cla1m for reser\ 1ce
Wlth probab1ltty R
Solutlon~ The states of the queue1ng system are
So-the S} stem IS Jdle.
s1-one server 1s busy.. ,
s"-k servers are busy (1 ~ k ~ n),
sn -.ail nt servers are .busy,
.
' there 1s no queue
sn+r-all n servers are bus), r customers are tn the queue (r ~ 1'
2, ) .,.,
The d1rectod graph of states 1s sho,vn 111 F1g 11 30, 'vhere 1-1 = Pl-l
It can be .seen !rom thJs graph that the system 1.s equ1valent to another
Ch. 11. The Qneue ing Theor y 389

system with a comp lete guara ntee of satisf actio n but with a servic e
~ ~

inten sity of 1-l = Pl.l; for this syste m p = 'J../1-l = 'J../(p~t). Form ulas
-
(1'1.0.21)-(11.0.25) rema in valid only when p is subst ituted for p and
-
- for
1-L 1-l·

....----. A. A- it A, it J 2 -1
I So l~r-1 s, I.-:- e .. ._~! SkI<£::,.. .-:1 Sn ~~ • •• ._ :Jswrj __~..,
jf 2jf kp (ktf}p np np np np.

Fig. 11.30

11.31. An eleme ntary single-server closed queue ing system . One work er
serves m mach ine-to ols which fail from time to time (requ ire reset-
ting). The inten sity of the failur e proce ss of one mach ine-to ol is 'J...
If the worke r is idle when a tool fails, he imme diate ly begin s to reset
it. If he is not free, the tool joins the queue to be reset. The failur e pro-
cess of the mach ine-to ol is statio nary Poiss on's, the time neede d for
-
reset is expon entia l with param eter ~L = 1/tser· Defin e the states of

ml
ISo I~ ~ I s,
p

Fig. 11.31

the syste m, enum eratin g them in accor dance with the numb er of faulty
tools, find the limit ing proba biliti es of states of the syste m and the
efficiency chara cteris tics: A, the avera ge numb er of tools set up by the
worke r per unit time, w,
the avera ge numb er of fault y tools, the r,
avera ge numb er oi tools in the queue , and Pbusv• the proba bility that
the work er will be busy. ·
Solut ion. The states of the syste m are
so-al l the mach ine-to ols are in work ing order (the work er is idle);
s1-on e tool is faulty (the work er is settin g it up);
S~t-k tools are faulty , one is being set up, k - 1 are in the queue ;
sm-al~ m tools are faulty , one is being set up, m - 1 are in the queue .
The. dne~ted gra~h of states is show n in Fig. 11.3'1.
Des1g natmg p = t.!~L, we get the follow ing from the gener al birth
and death formu las:
Po={ 1+m p+m (m-1 )p:l+ ... +m( m-1 ) ... (m-k +i)p n
..1.. + rn 'r"}-1
I • • • • '

Pt = mppo, P2 = m (m -1) p2 p 0 , ••• ,


Ph= m(m -1) ... (m-k+1)php 0 (1~k:s;;;n), . . .'
Pm = mlprnPo· (11.31.1)
'
300 Applied Problems Jn ProhJJbiltty The Dry
-
In order to ftnd the absolute capac1t} A, "'e ftrst find the probabiltty
tba t the "orh.er 1s busy
(11 31 2)
If the " orker IS bus} he resets up ~t tools per un1 t t1me hence
A = (1 -Po) J-t (t 1 31 3)
\Ve can express the average number of faulty tools 1n terms of A as w
follo"s Every opernttng mnchtne tool generates a fa1lure process w1th
1ntens1ty A"' 1th m - wtools operatlng on the al-e.rage
- The fa1lure pro
cess the machines generate bas an 1ntens1ty (m - w) and all the faults
are eltm tnated by the worker Thts means th 1 t (1 - p 0 ) J1 :=::: (m - w) A
whence
w= m - (1 - Po)/p (11 31 4)
The average number r of maeh1ne tools tn the queue can .be found as
followS'
--- - - {t t 31 5)
where k IS the a'\erage number of tools betng ~erved (or the average
number of busy Qervers) In th1s case the number of busy servers 1s zero
tf the 'vorker IS 1dle and un1ty 1f he 1s busy hence k = 0 Po + 1 (1 -
Pol = 1 - Po Consequently
-
T ;;::; m- (1 - Po)/p - (1 - Po) or r= m - (1 - Po) (1 + 1/p)
{11 31 6)
11 32* On the hypothes1s of Problem 11 31 find the average trme
t; for wh1ch an arbttro.rtly chosen mach1ne tool that bas fatled will
have to wa1t to be served
Salut1on L\ttl~ s formula '"~ have used tS su1table only ior open
queueing system 'vhere the tntenstty of the arrlval process does not
depend on the state of the system It cannot be used for closed systems
We can fi.nd the ttme-tq by JUSt a.ssum1ng that a demand arrtves (a tool
fa1ls) at a moment t \Ve then find the probabtl1ty that at that moment
the system 'vas In state sh (k ~ 0 m - 1) {tt IS clear that 1t
could not be 1n a state sm) Let u.s cons1der m hypotheses
H 0- the system ''as In state s 0 "hen the demand arr1 ved
H 1 -the system 'vas In state s1 \vhen the demand arr1ved
Rh-tha system 'vas 1n state s~ when the demand arrtved
ifm 1 -the system 'vas 1n state sm 1 when the demand arrtved
The a pr1or1 probabilltles of thc~e hypotheses are Pn p1
P" t Pm-t
\Ve shall now find the a posteriori probab1lttu~s of the hypotheses
gtven that the event A - {a tool has failed 1n an elementary t1me
1nter' a] (t t +
dt)} has occurred On the hypotheses // 0 H1 '
Ch. 11. The Queueing Theory 391

Hrn_ 1 the conditiona l probabilit ies of this event are


P (A I H 0) = m'A dt, P (A I H 1) = (m - 1) 'A dt, .•. ,
P (A I Hh.) = (m- k) 'A dt, ... , P (A I Hm_1 ) ='A dt.
Using Bayes's formulas, we find the a posteriori prob~bilit~es of the
hypotheses (proYided that the event A occurred). Des1gnatm g these
I"'J _, ~ ,..._,

probabiliti es as p 0 , p 1 , • • • , Pln . . . , Pm-I, we obtain


,.., (m-1)p1
mpo
. .
~

Po= m-1 Pt= '


. ,
' m-1

~ (m-k) Ph ~ (m-k) Ph
k=O h=O
,.., ,..,
(m-k) Pk Pm-1
P~t= m-1 • • · ' Pm-1 = • (11.32.1)
' m-1
~ (m-k) Ph ~ (m-k) Ph
k=O k-0

If we know these probabilit ies, we can find the complete expectatio n


of the queueing time of the faulty tool. If a tool fails when the system
is in state s 0 , it will not wait for service; if the system is in state s1 ,
the tool will remain in the system for the time 1/!l on the average; if
it is in state s 2 , then the time is 2/!l and so on. Multiplyin g the proba-
bilities (11.32.1) by these numbers and adding them up, we obtain
m-1
1 "'
-tq = !1 -
LJ kph. (11.32.2)
k=i

11.33. A worker serves four machine-t ools (m = 4), each of which


fails with intensity 'A = 0.5 faults/h. The average time needed for
-
repair trep = '1/~t = 0.8 h. All the processes are stationary Poisson's.
Using the formulas in Problems 11.31 and 11.32, find (1) the limiting
probabiliti es of states; (2) t.he capacity A; (3) the average relative idle
period of the worker P 1d 1e; (4) the average number of tools in the queue T;
(5) the average number of faulty tools w;
(6) the average queueing time
of a faulty tool tq; (7) the average output of a group of tools, with due
regard for their incomplet e reliability , if a working tool produces l
articles.
Solution. 11 = titrep = 1.25 and p = 'Aht = 0.4.
(1) From formulas (1L3L1) we have p 0 = {1 + 1.6 + 1.92 +
1.53 + 0.61}-1 = 6.66-1 ~ 0.150, p 1 = L6p 0 ~ 0.240, p 2 = 1.92p 0 ~
0.288, p 3 = 1.53p 0 ~ 0.230 and p 4 = 0.061p 0 ~ 0.092;
(2) A = 0.850~t ~ 1.06 tools per hour;
(3) P Idle = Po = 0.150;
(4) r- ~ 4 - o.s5o (t + z.s) ~ t.o3;
w
(5) ~ 1.03 + k ~ 1.03 + 0.85 = 1.88;
392 Applltd Problemr in Probabfllty Theorv

(G) from formulas (11 32.1) and (11 32 2) we ha\ e


Po = 4
Po ~ 0 283
4Po+3Pt+2PJ+P3 ,....., '
-'W ..., -

p 1 :::::: 0 340, P2 == 0 2 70, p 3 = 0 108 ~


0 BlPt + 2p2 + 3p 3 ) ~ 0 9&4 h.
- _.. .i<tr# -

tq =
{7) the product! VllY of the group of tools lS (m - w) l ::::: 2 121
1t 3!( .tl n elen1entary mult' serter closed queue,ng system A team of
n '' or~ers handle n-z. machine-tools (n < m) The fntlure process of each
too] ha~ an 1ntcn~1 ty A. and the average t 1me for resetting a tool tsfr =
11tl All the proce~<=-es arc stattonary I 1 ots~on ts Ftnd the IImlting prob-
[~ bl11 t H:'~ of stntes of the queue1ng Sl stem, the absolute cnpnctty A
-
and tho n' (Jrogc number of faulty tools w
Sol utlon \\ c en urn era te the states of the s) stern 1n accorda nee '" 1th
the number of f tul l} tools
s 0 - all the tools are tn norA. 1.ng order, tlre \\o ol"lc rs are 1dle,
s 1 ~one tool 1c; fault v one \\or her 1s busy, the other 'vorlers are
1dle, t

';h. -h. tools arc fault}, h. '' orJ..crs arc busy and the other \\ orf ers are
tdlP (!.. < m), ,
s~ -n tools are faul tl and all the "or~crs arc buS} t
sn +1 -(n+1) tools aro faulty, n of them are being set up and one tool
IS \"\a 1ling for scr\ tee, ,
s,. ~,.--.....(n +r) tools are fault l, n of the1n are be1ng set up and r tools
Tre tn the queue (n r < m), ,
sm.-all them tools are fault), n of them arc betng set up and m- n
tools are 1n tlJc queue
\\-,e tnvtte the reader to ton~truct the dtrected graph of states of the
systen1 on hts o" n and '\ e onl} present the final formulas for the prob
a b 1l1 tIes of s ttt t t.S
Pc={i + ~ P+ m(~~t) P2 +·· +m(m~1) kl (m-k+fl p"
m {m-f) (m-n) n+t
+ 1 nt n P
+ + m (m-1) m-(n+r-t) mt pm}-1,
n! nr + n! n rn-
m _ m(m-t) z
Pi ~ 11 PPo P2 - 21 P Po, '
m (m~ 1) (m--k+ 1) h (t
Ph= kl P Po ~~~n),
m.(m-1) tm-n+1) n
Pn = n} P Po, • • .,
Pn+r ==:... m (m-1.} [m-{n+r-1H pn+rp"' (1 &r~m -1)t
nl nr "' ~ --.

Pm"""" nl ~~-n p'" p 0 , where p =AI~. (11 34 1)


Ch. 11. The Queueing Theory 393

The average number of busy 'vorkers can be expressed in terms of


these probabilities, viz.
-
k = O·po + 1p1 + 2p2 + · ·. + (n- 1) Pn-1 + n (Pn + Pn+l
+ • • • + Pm) = P1 + 2pz + · · · + (n- 1) Pn-1
+ n (1 -Po- p 1 - • • • - Pn- (11.34.2)
1 ).

The absolute capacity for service


- (11.34.3}
A = kft,
and the average number of faulty tools

w= m - ~//.. = m - kip. (11.34.4)

11.35. Two workers (n = 2) are handling six machine-tools (m = 6).


A tool must be reset every half-hour on the average. Ten minutes, on
the average, are needed for a worker to reset a tool. All the processes
are stationary Poisson's. (1) Find the characteristics of the queueing
system, i.e. the average number of busy workers k, the absolute capacity
for service A and the average number of faulty tools W. (2) Find out
whether the characteristics of the system will be improved if the workers
were to reset the tools together, spending five minutes on the average
to reset a tool when working together.
Solution. (1) We solve the problem in the first variant (the workers
work separately). \\Te have m = 6, n = 2, A,= 2, fl = 6 and p = /ol~t =
1/3. By formulas (11.34.1) we have
6 1. 6-5 1. 6·5·4 1. 6-5·4·3 1
Po= { 1+ Ts+ 1·2 32 + 1·2·2 33 + 1·2·22 34
6·5·4·3·2·1 1 }-1 ,. _, 0 · 1-3
+ 6·5·4·3·2 1
1·2·23 36 + 1·2·2 4 3" ,.._, b '
p 1 = 6/1 X 1/3p 0 ~ 0.306.
We use formula (11.34.2) to find the average number of busy work-
. -
ers, I~. l~ = 1· p 1 +
2 (1 - p 0 - p 1) ~ 1.235. The absolute capacity
: = k~t ~ 7.41. The average number of faulty tools = 6 - 7.41/2 ~
.30.
w
(2) If the workers work together to reset the tools, then the queueing
system turns into a one-channel system (m = 6, n = 1) for ~t = 12.
We do the calculations for p = ')..f~t = 1/6. By formulas (11.31.1) we
get
p =
0
{1 + 1+ 6·5 6-5·4
62 + 63
6-5·4·3
+ 64 + 6"6·5·4·3·2
I
1 6-5·4·3·2·1
66
}-1
:::::::0.264, p 1 ~ 0.264, p 2 ~ 0.220, p 3 ~ 0.147, p_1 ::::::: 0.076,

Ps~ 0 . 02 !.1, PG~ 0 . 00".g, -w= 6 - °'736


1/6 ::::::: 1.:J.
-g
/
.394 App lied Problem1 tn Prob 4btl tty The ory

The average number of busy cha nne ls k = 1 - Po = 0 736 Bearing


1n m 1nd ho, ve' er, tba t 1n thts case the "channel'" cons1sts of two ~ ork
ers,. the ave rage num ber of busy 'vor},ers lS
l = 2 X 0 736 ~ 1 47 and A ~ kp. = 0 736 X 12 ~ 8 8
Thus.. 1n t h ts case, by ass1st1ng each oth er, the workers (channels)
Increase the average bus y pertod from 1 23 to 1 47, redu ce the average
num ber of fau lt) tools from 2 30 to 1 59, and 1ncrease the capactty
from 7 4 to 8 8
11 36 J ohs arrt "e at an elem enta ry thre e cha nne l congestion system
In a process "'1t h 1ntens1ty A ~ 4 JObs/mtD the service t1me per Job
1

per channel ts~r = 1/J.t = 0 5 mtn Is 1t better~ from the poi nt of v1ew
of the cap actt y of the sys tem , to make all thre e cha nne ls serve each
Job together ~o as to decrease the ser\ Ice t1m e of a JOb to a th1rd? How
wil l thts affe ct the ave rage \Va ltin g t1me of a JOb?
So1 utlo n (1) \\' e find the probab Il1t tes of stat es of the sys tem wtth
out any Interchannel asst stan ce, us1ng Erl ang s form ulas (11 0 6), 1 e
21 2' } - 23
Po= { 1 ,
2
11 +2 ! + 31 t
::::::::0158~ P,e 1 ==p3 = 31
p 0 ~021,

Q = 1 - Pr•t A = 'AQ ~ 3 16
~ 0 79~
\\~e then calc ulat e the ave rage 'vat ting tim e of a JOb as the pro bab dtty
Q tha t a JOb \Vlll be ~erved mul t1pl 1ed by the ave rage service tim e, 1 e
-tw ~ 0 79 0 5 ~ 0 395 min
(2) If \Ve comb1ne the thre e cha nne ls Into one w1t h para met er f-1 =
3 X 2 = 6 "e obt ain
1
Po= (i+ 2/ 3)_1 = 0 6, p 1 = (2/3) 0 6 =0 4,
Pret = Pt = 0 4, Q = 1 - Pret = 0 6, A = i..Q = 4 X 0 6 == 2 4
Com par1 ng the cap actt y of thts sys tem to tha t of the first vart ant,
we ~ee tha t 1t has not r1se n (as 'vas the case 1n the prec edin g prob lem ),
tnde ed 1t has fall en' It ts easy to und erst and ,vhy thts hap pen ed \Vben
the t\vo cha nne ls '"er e com btn ed, the pro bab tltty of refusal 1ncreased
(the pro hab 1l1t \ tha t an arri vin g Job find s bot h cha nne ls busy and
dep arts unser"ed)
The a' erag e "a1 t1ng ttm e of a JOb 1s sma ller 1n the second var1ant
tha n In the first rw
= Q (1/6) =0 1 mln Ho" eve r, thiS decrease has
a h1g h cos t Sinc e 1t comes at the exp ens e of a num ber of JObs departtng
uns erve d and , hen ce, the ir "at t1n cr t 1me tS zero
\Vh y the n In the pre red tng pro~blem dtd com btn1 ng two cha nne ls
Into one Incr ease the efficiency of serv ice? \Ve rnv1te the read er to
thtn k thts o'e r and exp lain the seem ing con trad tct1 on Is It because
tn a sys tem "·1 th refu sals JObs do not JOin the que ue?
11 37. \'7e are gt,e n an elem enta ry thre e-ch ann el system w1th an
unb oun ded que ue The 1nte ns1 ty of the arri val proc ess 1s ],., = 4 de
man ds/h and the average serv ice ttm e taer = 1/Jl = 0 5 h
Ch. 11. The Queueing .Theory 395

Bearing in mind (1) the average length of the queue, (2) the average
queueing time of a demand and (3) the average waiting time of a de-
mand, is it advantageous to combine all three channels into one with
an average service time three times as small?
Solution. (1) In the original variant (three-channel system) n = 3,
'), = 4, ).t = 1/0.5 = 2, p = 1./).t = 2 and x = pin = 2/3 < 1. The
limiting probabilities exist. We calculate p 0 by formulas (11.0.21):
2 22 23 24. (1/3) } -1 1
Po= { 1 + + + +
11 2! 3! 3! (1-2/3) =g-'
- 24 ·1/9 8 -
-r 2
r = 3·3! (1/3)2 =g- ~ 0.889, tq = r=g- ~ 0.222 h,
- -
tw = tq + t er ~ 0. 722.
5

(2) When we combine the three channels n = 1, f. = 4, ).t = 6,


and p = 2/3. From formulas (11.0.12)-(11.0.14) we have
- (2/3) 2 - r
r= ~ 1.333, tq = -y: ~ 0.333 h,
113
tw = tq + tser = 1/3 + 1/6 = 0.500.
We see that although combining three channels into one slightly
reduces the average waiting time of a demand (from 0. 722 to 0.500),
it increases the average length of the queue and the average queueing
time of a demand. This is because the arriving demands must wait
in the queue while the three channels serve a demand together.
The increase in the service efficiencv when channels are combined
in a closed queueing system is due to tl;e fact that the intensity of the
arrival process of demands decreases when their sources (machine-tools)
fail.
11.38. We consider a queueing system which is a taxi-rank at which
passengers arrive in a stationary Poisson process with intensity 'A
and cars arrive in a stationary Poisson process with intensity ~t. The
passengers form a queue which decreases by unity when a car arrives
at the rank (we take a case when the taxi-driver drives the passenger
wherever the latter wants to). If there are no passengers at the rank,
t~e cars form a queue. The number of parking places at the rank is
hmited (is equal to m) and the number of places in the queue for pas-
sengers is also limited (equal to Z). All the processes are stationary
P?isson's. A passenger boards a taxi instantaneously. Construct the
d!l'ected graph of states of the system, find the limiting probabilities
of stales, the average length ~~ of the queue of passengers, the average
~ength ~ of the queue of cars, the average queueing time per passenger
tq.p• the average queueing time per car tq.c and see how these charac-
teristics will change as m -+ co and l -+ co .
.Solution. \\7 e shall enumerate the states of the system in accordance
Wlth the numbers of passengers and cars at the taxi-rank, labelling /'
390 Apphed Problems In Probability Theory

them by t'\O 1ndtces the ftr~t tndex for the number of passengers and
the second for the number of c1.rs The state s 0 0 means that there arE
ne1ther pa~~engers nor c1.rs at the rani"!.., the state s0 h means no cars and
k pas,engers 1 the state sc 0 means c c.ars and no pas~engers The d1rectec
graph of states of the ~}~tern ts sho'' n In F1g 11 38 The graph corres
ponds to the b1rth and death process Employ1ng the general formula~
(11 0 4) for th1s process and des1gnat1ng Alfl p we obta1n =
o = PPl Ot Pl-2 o = P P1 o• • •• Pe o = P ~Pl o• • • •"
2 1
P1-1

P-D o -_ p!p l 01 t
p 0 k -__.. . pZ+IIp l 01 • • •' p 0 m -- pl+mp I Ot
Pl., = {1 +
p + p2 + .. " + p1+m}-1 (11 38 f
or c;: ummtng tho geometric progres~ton 'vt t h a common ra l1o p.
P1.0 = (1 - p)/(1 - pl+m+l) (1 1 38 2
The prob1btl1l1e~ 1n (11 38 1) form a gcometr1c progresston "1th tht
first term p 1 6 and a common r1tlo r If p > 1., then the most prohah11

t>---.l ..t A l l l ... l


S1 11 • •
PP.
0:~ sco~~. • ~ S,o, .. '" [~, 8j::j s~,l
P Jl P
:• •·;{~ ~: ": ••_
p p
s4,.

Ftg 11 38

state of tbe S) stem lS s 0 m "h 1ch means thn t there are no c 1rs and al
tbe places 1n the queue of pnc:.co.engers are occup1ed, 1f p < 1, then th~
most probable stato ~~ s1 0 , "h1ch means that there are no passenger~
and all the places 1n the queue of cars aro occupied
As m ~ oo and l ----... oo no lim1t1ng probabtltttes e't.lst .. for p > 1
the queue of pas~engers and for p < 1 the qucuo of ca~ tend to 1ncreas~
1ndefi.nttell (but th1s tenuonc} 1s rcstr1cted by the fact that ne1the1
the number of pa~sengers nor the number of taxes 1n the ctty 1s 1nfin1te)
1 t 39* In a sr-Jf servJce canteen there JS one t:hst.rJhntJon rounte:
at '\ htcb both the first and the second courses are ~er\ ed Customer~
arr1ve at the canteen tn a stationary Poisson process "Ith 1ntens1ty ).
the service of the first as \veil as the second course takes a random t1mi
"htch h 1s an e.:\.ponenttal distrthutton \VIlh the same parameter J.l.
Some customers tal\o the first and the second course (the fraction o:
.such customers Is q) \\ hde the others take only the second cou1se (tht
fraction of them IS 1 ~ q) F1nd (1) the cond 1ttons under 'vh1ch therE
15 a stable, st1.t1onary state of ~er' ICe at the canteen, (2) the a\ erage lengtt
of the queue and the average wa1t1ng ttme of a customer 1£ he need~
~ •..,m& -.. 1.!ll \1a:, x . . ·ut~ 'to -ea\ one course ann a •ttme ~ ~o ~1-ft WJ~
courses
Solut1on The c;.ervJce time of one customer IS a random var1abl1
T whrch has e1ther an Erlang dJstrlbutton of order 2 'v1th the meat
-
value t 5er = 2/J-1 WIth probabtll t y q or an exponent 1al dIS trtbutiOl
Ch. 11. The Queueing Theory 397

with parameter ~t with probability 1 - q. Let us find the mean value


of the random variable T. To do that, we use the complete expectation
formula (4.0.20) with two hypotheses: H 1 = {a customer takes only
ihe second course} and H 2 = {a customer takes both courses}. The
probabilities of these hypotheses are P (H1 ) = 1 - q and P (H 2 ) = q.
The complete expectation of the random variable T
M [T] = P (H1) M [T l H 11 + P (H 2) M [T l H2] = (1- q) (1/~)
+ q (2/~)+ 1)/~.
= (q
This means that the canteen can serve ~/(q + 1) customers per unit
iime on the average. If 'A;?:: ~/(q + 1), then the system is overloaded
and limiting probabilities do not exist, but if 'A< ~/(q + 1), then
they exist. We assume 'A< ~/(q + 1).
-
To find the average size of the queue r, the average queueing tq and
-
waiting tw times of a customer, we use the Pollaczek-Khinchine for-
mula ('11.0.31). To use this formula, we must know the coefficient of
variation of the random variable T, i.e. the service time. \"1\Te first
nnd the second moment about the origin of that variable M [ T 2]. From
the complete expectation formula (with the same hypotheses H 1 and
H 2 ) we obtain
M [T 2 1 = (1 - q) M [T 2 I H 1 ] + qM [T 2 I H 2 l. (11.39.1)
On the hypothesis H 1 the random variable T has an exponential
distribution with parameter ~t:
1\I [T 2 I H1 1= Var [T l H 1 ] + (M [T I H 1 ]) 2 = 1ht2 + (1/J-1) = 2
2/J-12 •
On the hypothesis H 2 we calculate the second moment about the
<>rigin of the variable T by the formula
l\tl [T2 I H 21 = Var [T I H 2] + (M [T l H 21) 2 •
~ut l\I [T I H 2] = 2/~, (M [T I H 2 ]) 2 = 4/~ 2 ; the variance Var [T I H 2 1
1s twice as large as Var [ T I H 1 ] since it is the variance of the sum of t1vo
independent random variables T1 and T 2 which each have the same
<listribution, i.e. Var [T l H 2 ] = 2/~ 2 • Consequently
l\1 [T2 l H 2 ] = 2/~t 2 + 4ht 2
= 6/~ 2 ,
whence we get by formula (11.39.1)
l\1 [T2] = (1 - q) 2/~t 2 + q6/J.L2 = 2 (1 + 2q)ht 2 •
The variance of the random variable T
Var [T] = 1\-1 [T 2 ] - (1\1 [T]) 2 = (1 + 2q- q )/J.L2 2

From this we can find the coefficient of variation of the random vari-
able T, i.e.
v11 =)! 1 + 2q- q /(q+ 1).
2
398 Apptred Problemt fn Probabflity Theory

Substttuttng th1s express1on and the expressxon p = A. (q + 1)/11


1n to the Pollaczek K htnch tne formula ( 11 0 31), " e get

- <"-'Ill'> Cq+ 1)' [ 1+ t ~~;;q1 ] {1..'/ll•> rt +2qJ


r = 2(1- (~/Jl) (q+ 1}) = 1-(X/!J) (g+ 1}
Furthermore, t 9 = Tl)., and f;. = Iq + (q + 1)/~-t + (q + 1) ~ =
tq +
(q + 1) ("t + 1/J.l)
1t 40 An example of an elementary congestlon system w~th prlordy
Customers .nrr1 ve at a two c:erver congest ton system In t\\ o stattonary
Potsson processes process I ""1th 1n
J , JI
~
tens1ty A1 and proce~s I I '' 1th 1n
S,o tens1 ty A2 (for brev1t~ "e shall call
r

Soo - Sza
Pt Zp, them "customers 1 • and 'customers
II ) Customers I ha'e a pr1or1ty
I
p t. "' Af I p.p lt At
lt of ser' 1ce O\ er customers II such that
lr

s1f Jf a customer I arr1ves wben the .ser


Sot ...
vers are busy and at least one of them
I f1
P.t IS serving customer II then the arr1v~
2p At J, 1ng customer I d1splaces customer
I I occup1es h1s place and the latter
Saz departs unserved If a customer I
arrtves 'vhen the ser\ ers are all serv-
rtg t t 4.0 tng customers I, he 1s refused and
I e,' es tho system cu~tomer I I IS
ref used tf he arr1 ves "'hen both servers are busy (no matter "h1ch
customers are betng served)
Construct .n marked dJrecterl graph of states of the queneJng SJo stem
Ia bclltng the states by the tnd 1ces ( i, 1) the first Index sho\\ 1ng the n urn
her of customers I and the second. the number of customers II present
1n the system \Vr1te the equattons for the ltmlttog probabi]tties of the
states Solve them at A1 = A~ = ~~ 1 = 112 = 1 Express the followtng-
efftciency characteristiCS of the system 1n terms of p 'l (t J ~ 2) +
Pr~t (Pler)-the pro.Oablltty that customer I (I I) will he refused.
when he arr1 ves,
A1 (A2)-the average number of customers I (II) served by the queue-
Ing system per un1t t1me
k~ (~)-the average number of servers busy w1th ~er\Ing customers
I (II)
P ret A. k-the same charactertstJcs for the system as a ''hole,
1rrespect1 ve of the type of customer
Solution The states of the system are sn 0 , there are no customers
1n the system s10 , there IS one customer I and no customers II 1n the
system So 1 there are no customers I tn tho system and one customer II
S2ch there are t'"o customers I 1n the system and no customers II
S11 there ts one CuQ,tomer each I and II, s~ 2 there are no customers I
and t\\o customers II 1n the system The marked graph of states of the
system IS sho'" n 1n F1g 11 40
Ch. 11. The Queueing Theory 399

The equations for the limiting probabilities of states:


(/.,1 + A.2) Poo = ~1P1o +
(1..1 Az ~1) P1o = A!Poo + 2~l1
~2P01• + +
X P2o +
~2Pw 2~1P2o = A1P1o A1P11• (1..1 A2 ~t2) + + +
X Po1 = AzPoo ~!Pn + 2~zPo2• (A.l +f-t1 f.-t2) Pn = A2P1o + +
+A1Po1 +
A!Po2• (1..1 +
2~2) Poz = A2Po1• Poo P1o P2o Pot + + +
Pn Po2 = 1.+ +
Solving them at /.,1 = A- 2 = ~1 = ~2 = 1, we get
Poo = 0.20, p 10 = 0.25, P'.!o = 0.20, p 01 = 0.'15,
Pn = 0.15, Po 2 = 0.05, = P2o = 0.2Pm Pm
= P2o + Pn +
Po2 = 0.40, A1 = A.1 (1 - P~m = 0.8.
We calculate the variable A 2 taking into account that some custom-
ers II that have started to be served are displaced by customers I
and depart unserved. The average number of such customers per unit
time is 1..11 (p11 +
p 02 ) and, consequently,
A2 = l..2 [1 - (p2o + Pu + Po2)] - A.1 (pu + Po2) = 0.4,
k1 = A 1 f~v k2 = A2/~ 2 .

The probability Prer that an arbitrarily chosen customer arriving


at the system will be refused can be found by the total probability for-
mula with hypotheses H 1 = {a customer I has arrived} and H 2 =
{a customer II has arrived}. The probabilities of these hypotheses are
P (H1) = 'A1/(A.1 + 1.. 2) and P (H2) = 1.. 2/(A-1 + 1.. 2 ).

Consequently
p
rer = 1.,
J..1
+ t,. pn>
ref
+ ~
t,. + 1.,
pc2> _ 0 3
ref - • •
1 2 1 2

Note that we could obtain all the characteristics for customers I by


completely ignoring customers II and treating the problem as if only
customers I arrived at a two-server system with refusals. 'Ve inYHe-
the reader to Yerify this by calculating all the characteristics for a two-
server system with refusals at which only customers I arrive.
11.41. The conditions of the preceding problem are changed so that
the number of servers in the system is n = 3. Construct the directed
graph of states of the system. 'Vrite the equations for the limiting
probabilities Pii (i + j ~ 3), where i is the number of customers I
and j is the number of customers II which are present in the system. Con-
sidering the equations to have been solved, express the same efficiency
characteristics as enumerated in the preceding problem in terms of p 11 •
Solution. The directed graph of states is shown in Fig. 11.41. "
/
40A?_. t pplllf!d Problem • _!n Proba. blltt y Theo~y

The equ at1o ns for the ltm1t1ng probab1l1tt~s of stat es are


(~t + J-2) P oo =
(At A2 l-11) Pt o !lU'to + Jt2P ot, + +
= A1PoG 2f-!tP2o +
fLtPlt, {itt + At + 2(.11) Pto +
= ~1P1o + 3fltPso + P..tP2t•
(~1 + i..e + p.,) Po1 = 7tw,PoD + Jl:P lt + 2ft2Po:,
fAt + ) t + IJ1 + Pz) Pit = ArPot + A-JPta + 2PtP<Jt + 21l,P12•
(A.t ~l2 + +
2~Ll) P21 = At (ptt + P12) + A-tPto' (~ A2 + 2l!2) P{J2 +
+
== i ~Po1 + P1P12 3J12Po3, (A,l }1 1 + 2~t:) P12 At (po2 Po3) J..2P11' + = + +
(11 + 3112) Po 3 'A 2P o2 , =
Pco + Ptn +
P2o _,... P3o + Pet + P11 + P21 + Po2 + P12 + PD! = f,
Pr~~ = P3o P~~~ = Pao P2t P11 +P ost + +
A1 = At {1 - PJo} Aa = A-2, (1 - (p:Jo + P21 P12 Po,) J + +
-- A1 (pc3 + P12 Po~), +
kt = Atf~Ltf' 'kz = AtiJL,f k == kt f-kzt'
At+ r;; ref + ~ ~
p tef = "'1 pu) +~ ~ Prtlr
1 J 42 An ezam pJe of a queue~ng system wztk absolute przorlty Custom
cr.s I and I I arrt ' e at a s1 ng le ~er' er que uetn g S) stem 'vl th two places

Af
1n the queue (m = 2) 1n
l 1 ~
o
l
t''
stat ion ary Po1st0:,on processp,s
Son - - S,o
__..., 1
~
~.

/1, 2p1
S2o - SJo \Vtt h 1nte ns1t ies J..1 and A.2
... '
Jp, The ser ' 1ce t1mes are expo
I~
A,1 Pt A.2 flt ~.? nen t1al '' tth para met ers }11 and
}lz ..~.,
'
J, ~ l, _......... .,..
\ It
~' A customer I who arr1ves
Stv - $,t, at the sys tem dts places custom
Stf
er I I tf he ts betng serv ed or
~

}1} 2}/-t
{~
take s a plac e 1n the queue tn
.A. 2, 2pz At 2p2 fron t of h1m 1f he IS 1n the
l1
J;t ~ que ue The d1s plac ed cust ome r
Sal .,. S1t II leav es the sys tem unser\ ed
P.t 1f th~re are no mor e places tn
I~
..t 3ft-: Jt the que ue or JOlns the queue
2
\It
1f ther e are pJaces Lah ell1 ng
SoJ
the stat es of the sys tem '\\ 1th
the 1nd1ces ' and J eorr e
spo ndt ng to the num ber of
F1g 11 4 t cus tom ers I and I J pres ent 1n
the sys tem , con stru ct a marked
dtre cted gra ph of stat es of the sys tem and wt1te the equ atio ns for
the 11 mt t l ng proha btl 1t1es of stat es Con s1d er1n g thes e equ a t1on s to hav e
been c;ol \ ed. ex press the follow1 ng eff1c1ency cha ract erts tics of the
sys tem tn term s of p 11 (~ J ' 3) +
Ck. 11. Tlte Queueing The ory 401

Pm (Pim -th e pro bab ilit y tha t a cus tom er I (II) wil l be refu sed
imm edi ate ly afte r his arr iva l;
Q1 (Q 2 )-t he pro bab ilit y tha t a cus tom er I (II) will be ser ved ;
~ (i'; )-th e ave rag e num -
ber of cus tom ers I (II) in f f 1
Soo SID . Szo s'Jo
the que uei ng sys tem ; ,J
Jl, P.t Pt
i;_ (r~)-the ave rag e num ber
of cus tom ers I (II) in the J 2 /12 Az flz ltz Pz
A., 11,
queue; 2t
- >)-the ave rag e wa it-
-t~J> (t~ Sot '- s,,
Szr
ing tim e of cus tom er I (II) ; !Lt l't
- -
tq> (t<~>)-the ave rag e que ue- J 2 /lz tlz P.z 'A-r
ing tim e of cus tom er I (II) ; J, t
-
tw- the ave rag e wa itin g Soz Stz
tim e of any (ar bitr ary ) cus to- fLt
me r J 2 h. il,
-'
tq- the ave rag e que uei ng
tim e of any cus tom er. So:J
Sol utio n. The sta tes of the
que uei ng sys tem are sii• i
cus tom ers I and j cus tom ers II Fig. 11.42
1 are in the sys tem (i +
j ~ 3).
' The ma rke d dir ect ed gra ph of sta tes is sho wn in Fig . 11.42.
The equ atio ns for the lim itin g pro bab ilit ies of sta tes are

(At + A2) Poo = 1-ttPto + l-t2Po11 (At + A2 + I-tt) Pto = AtPoo

+ !-ttP2o + l-t2Pt11 (At + A2 + I-tt) P2o = l•tPto + 1-ttPao + !-t2P2h


f.ttPso =A t CP2o + P2t), (A1 + A2 + ~t2) Po t= A2Poo + 1-ttPt + fl2Po2•
0-t + '-2 +I- tt+ fl2) Pu = A2P1o + l•tPot + !-ttP21 + Jl2P12•
(At + /..2 + 1-t2) Po2 = AzPot + !-ttP12 + !-t2Poa·
: +
(At + I-tt+ 1-t2) Pt2 = '),tPo2 + A2P1t. + /•tPos· (l,t 1-t2) Poa = A2P02•
+
Poo+ Pto + P2o + Pao + Po t+ Pu P21 + Po2 + P12 + Pos = 1,
;

..•
P (rer
l)-
- Pso.
pr•l
rer = Pso + Pzt + P12 + Poa , Q 1 -
- ·1 - pm
rer -
- 1 - p so·

t To fin d Q2 , we first see k A 2 wh ich is the ave rag e num ber of cus to
' me rs II ser ved per uni t tim e, i.e.
If

n A2 = f-2 (1 - Pim -A t (Po3 P12 + P21) +
'~
I'•
= A2 [1 - (p3o +
Pzt + P12 Po3)1 - 1,t (po3 + + Ptz + P21).
Div idin g .this exp res sio n by /, 2 , we find the ave rag e fra ctio n of cus tom - •

ers II bem g ser ved (the pro bab ilit y tha t a cus tom er II wil l be ser ved ).
'2G- 0575
402 Applied Problems tn PrababllUy Theory

i e Q2 = A !/). 2 Then
Z1 = 1 (plo + + Pl!) + 2 {p2o + P21) + 3p3o' Z2 = 1 (Pot + Pu
P11

+ P:t) + 2 (Po2 + P1:z) + 3po3 ~ = 1 fP1.o + P-zr) + 2p,a


-r~ = 1 (pll + P21) + 2P12
B} Littles formulas

\\ e can c1Jrulate all the charncteristtcs rel1.t1ng to customers J


" 1 thou t '")O 1v 1n g eq u1. t 10ns (11 42) but s 1m ply by 1gnor1ng the pre~ence
of cu~tonl{\rs II and cons1dertng tho S} stem to be one Wtth a l1m1ted
number of places Ln the queue (m 2) 1nd us1ng the formulas 1n
Sec 11 0 (ttent 1) to find Its charactertsttc~
t 1 t..3 An elem en.tary ro;ystem u tthout a queue and wlth a hea hng up"
of the channel~ Demand<; arr1' e at tl1o 1npul of an n-channel queueing
S.} c;;tem 1n " t;;tf nt1onary PoJ~~on proce<=s 'v.Jth l11lcnsJty A. The ser\JCP
ttme l!;J e\. ponent1 ,l "tt h p1rameter ~t Before the c:er\ ICe ts begun the
channel n1u~t be prepared ( heated up ) The t1me needed to heat t
the chan np I The a has "l. n expo nent1a 1 d ll{t r IhUt ton "'1 th parameter v
and doc~ not depend on ho"· long before the channel ceased functtontng
A dem1nd '' h1ch 1.rr1 ves \\hen n channel IS tdlc occupies 1t and "a1ts
for 1t to bo heated up 1nd then its ~ervtce commences A demand v..h1cb
arr1' es "'\\hen ~ll the chnnnels are bus) ("hen another demand ts be1ng
~er\ ed or ts \V1Il1I g to he "'Crved) departs unser\ t?d Ftnd the l1m.1t1ng
prob 1.b 1l1 t 1es of the ~' ~t em and the efficiency ch nracter ist1cs 1 e the
prob1h1ltty of refll~"ll P rtf the relat1' e cap1c1ty for servtce Q the
absolute C1.p'1ctty A and the a\erage number of busy channels k
Solution \\ c assume that the JOb ~er' 1c1ng procedure consists of
t\vo phac=.es '''lit tng for a channe] to be heated and the servuang Itself
-
Tser - T h~at +
...,
TIPr The random '\ 1.r1ahle Tser has a generaltzed
Erlang dtstr1but1on of order 2 ((;;ee ProblPm B 39) w1th parameters 1-l
and v 'Ve kno\v that Erlnng s formulas (1 1 0 6) are 'altd not only for
e'-.ponential but also for any dtstribution of serv1ce t1me Let us find
tbe varJabJe 11 - 11.1\I l Tser1 '\Ve have
- f"'W

f"'W

~I ( Tsf:r] = ~~ ( T b~atl + I\1 I Tser1 - 1/1-l 1fv = (!-1 + +


v)/(~c,v)
'vhenc& ~ = (t-tv)I(!J
,....,.
+ v) H"'tv1ng calculated = AI~ and p
substttuttng
th1s value of p 1nto Erlang s formulas (11 0 6) ,ve obta1n
hoJ ,...., ,..,

Po= { 1 + 1f + + r; + + p'"' }
nl
,.., ,.,;
pn
Prer = P" = nt Po A=AQ=A( 1- :; Po)
Ch. 11. The Qlleueing Theory 403

-
To find the average number of busy channels k, we must divide
,...,
A by 11:


11.44. An elementary one-channel queueing system with a "heated"
channel. Demands arrive at a one-channel queueing system with an
unbounded queue in a stationary Poisson process with intensity 'A.
The service time is exponential with parameter ~t (~t > 'A). Before the
A. A. A. J
Soo Sot Soz •
So'J •••

!) !) J}
p
f A. t A. 'f . J
sf! Stz .Srs •••

Fig. 11.44
demand is serviced, the idle channel must be heated. The time needed
to heat the channel is exponential with parameter v and does not de-
pend on how long before the channel ceased functioning. If the service
hegins immediately after the service of the previous demand has ended,
no reheating is necessary. Construct the directed graph of states of the
queueing system and write the equations for the limiting probabilities
of states. Express the efficiency characteristics of the system in terms
of these probabilities: the average numbers of demands in the system z
and in thP queue r
and the average Waiting and queueing timeS
of a job 7w and ~ respectively.
Solution. The states of the queueing system are (Fig. 11.44):
Soo-the channel is idle but not hot;
Sol-a job has arrived and is waiting for the channel to be heated;
Su-the channel is hot, one demand is being served, there is no queue;
So2-the channel is being heated, there are two demands in the
queue, ... ;
Sol-the channel is being heated up, there are l demands in the queue;
8u-the channel is serving demands, l - 1 demands are in the
queue, . . . .
The equations for the limiting probabilities are
1•Poo = !lPn, ('A +
v) Pot = APoo• ('A ~t) Pn = VPot f.LPw + +
(A. + v) Po2 = APott ('A +
!1) P12 = VPo2 'Apn ~LPt3• • • • 1 + +
('A +
v) Po.z = APo.l-t•
('A + !l) Pt.z =
00
+ APt.z-t + ~tP1.1+t• • • • ,
VPo,z
00

_21z (Po. z+Pt. z)f r= Lj l (Po. z + PJ.r+J)•


z= 1=0 l=t
404 Appllt!d Probl~ms ln P7obabllftv Thtory

B~ Little's formula ~ ._. ...... ......


tw = z./'A, tq = r/~
11 45* 'Ve are g1\en a stngle ~erver system w1th a l1m1ted number
of places 1n the queue m = 2 Customers arrtve at 1t 1n a stattonary
Po J.s~on process 1v1th 1 n tens 1t y A.
The ~erv1ce ttme has a generaltzed Erla.ng distribution w1th para
meters J-1 1 and ~t 2 (see Problem 8 39) F1nd the proba.btltttes of the states
of the system
s0 -there are no customers 1n the system,
sl-there ts one customer 1n the S)ostem (no queue)_,
s 2 -there are t\\O customers 1n the system (one 1s be1ng ~er\ed and
the other 1s "a1t1ng to be served),
s3 ~there are three cu.stomers 1n the .SJ stem (one 1s be1ng .ser1 ed and
t" o are 1n the queue)
( 1) F1n d the effic1ency charac ter1st 1cs of the system P ref" Q, A 'i
---
t, fw tg ( 1) Calculate tham for the values A = 2, ~J- 1 = 6 and J.l- 2 = 12
(2) Compare them v, 1th the characteristics 'vhtch would result for an
elementary queuetng system wtth the same )., and 1-1 equal to tl~er'
-
\vbere taer 1s the a\ erage service t1me of a customer tn tb1s system
Solution (1) The ~erv1ce process J.S not PoJsson is and, tbere!ore, the
system 1s not ~larkov tan Thts means that to find the probabtltttes of
states of the systemt we cannot use the
<J rd1nary techn 1ques we used for .i\farkov
processes With d1screte states and con
ttnuous t1me However, we can art1fi
C1ally reduce the process taktng place
1n the system to a J.,larkov process by
applying the so called method of phases
\Ve assume that the service lS tn t"'o
phases (I and II) wbtch last for the t1mes
T 1 and T 2 respectively The total serltCe
ttme Tser = T1 +
T2 , 'vhere T1 has an
exponent1al distrlbution w1th parameter
(.1 1 and T 2 has an exponenttal d1strtbu
t1on \VIth parameter p. 2 Then T,er W1ll
r f~ have a generalized Erlang dtstr1but1on
lg li.a 'v1th parameters ~Jot and liz (see Problem
8 39)
\Ve 1ntroduce the fol1ou 1ng states of the system
so-the system IS 1dle
s11-there 1s one customer In the system, the service 1s tn the first phase
s12---there 1s one customer 1n the system, the serv1ce 1s 1n the second
phase
s~n -there are t\\'0 customers tn the system (one 1s betng served and the
other Is \Vatting to he served) the service IS 1n the .first phase,
S22--there are t'vo customers tn the system, the service 1s 1n the second
phase
Ch. 11. The Quwemg Theory 405

s31 -there are three customers in the system, the service is in the first
phase;
s32 -there are three customers in the system, the service is in the
second phase.
There are no other states since (m = 2 by the hypothesis ) there can
be no more than three customers in the system. The marked directed
graph of states of the system is shown in Fig.11.45 . Using this approach,
we divide the state s1 into two states: s11 and s12 (or, in a brief form,
s1 = s11 + s12). Similarly s 2 = s 21 + s 22 and sa = s 31 + sa 2. The equa-
tions for the limiting probabilit ies which correspond to the graph in
Fig. 11.45 are
lvpo = ~2P12• (A. + J.l1) Pu = A.po + J.l2P22• (A. + ~t2) P12 = ~t!Pn•

with the normalizin g condition Po + p 11 + p 12 + P2 1 + P22 + Pa1 +


Pa2 = 1.
Solving these equations, we obtain
Po= {1 + <~+ fl 2) ~ + I. + J..s <A.+ ~t1 + !12l ~ A!12 <A.+ !12l
2

flt!12 112 fli!12


+ A2 (f..+ Ill+ fl2) + A4 <~+ !lt + !12) + A3 fl2 (f..+ !12)
flt !l~ 1-li~t~
+ ')..S (f..+ Ill+ !12) + A4 (~+Ill+ !12) + ~3 !l2 (!..+ !12) } -1 '
~llfl~ !lr!l~

Pu= _ I. p p _ i..3 (i..+l-lt+!lz )+l..2 flz (A+!lz}


P12- !1z 0• 21 - ~lj!lg

_ A2 (J.li+llz+A )
P22- ~tift~ Po.
Pst = /.4 (I..+ ~t~ + !lz) + f.3~t2 (!.+ !12)
~tifl~ Po,
Paz= [ }.4(f..+~tt+!12!+l..sfl2(f..+~t2)
!lifl~
+ I,S(f..+!li+!l2) J
!lt!l~ Po
. (11.45.1)

Next '"e find the limiting probabilit ies of the states s 1 ,


f..
Pt = Pu + Ptz =
!ltfl2
(A.+~~ + ~2) Po.

Ps =Pat+ Pa~ = !l~:t~ [(A-112 +A-~1 + ~1) (A.+~~+ ~z)


+ ~~~ (~1 + ~2) (A.+ ~2) ]Po. \ 11.45.2)
where Po is defined by the first formula (11.45.1).
406 Applied Problems ln Probabflltg Theory

The effic1enc y characterJ.st lCS of the s) stem can he found 1n terms of ,


the probah1l1 t1e.s Pnt p 1 t p 2 p 3 b} the formulas
Pret = p3, Q = 1 - Ps.. A = A. (1 - Pa)t
-z = 1p 1 + 2p + 3p2 3,
-r = 1p + 2p
2 3,

~ = ziA. tq = -;,,_ (tt.4s 3)


Substituting the numertcal data A. = 2, J.1 1 = 6, J-1 2 =
12 1nto for
rnulas (11 45 1), '' e obta1n Po = 0 540, P 11 = 0 182, P12 = 0 090t
p 21 = 0 087 p 22 = 0 050, p 31 = 0 029 snd P 32 = 0 022 'Ve nO\V
return to the 1n1t1al (not l\larko' 1an) queuetng system and obtatn
Po = 0 540 p 1 = 0 272 p 1 = 0 137 and p 3 = 0 051
Furthermore \\a ha' e from formulas (11 45 3) Pr,r = 0 051, Q =
0 940 A = 1 89 z = 0 705 r
= 0 243f tw
= 0 352 and tq = 0 122
(2) \\ e calculate the same characteristics for .an elementary queueing
S) stem "tlh the same iw == 2 and p. = (1/JJ. 1 + 1lft 2)- = 4 From lor·
1

mul1s (11 0 12) (11 0 15) ''e ha\e p = 0 5 Po ::::::; 0 533, I = 1 -Po~
0 467 p 1 = pp 0 ~ 0 267 p 2 = p2p 0 ~ 0 t33J p 3 = p3p 0 ~ 0 067,
z
Prer = p 3 :=:::: 0 067 Q = 1 - p 3 ~ 0 933, A ~ 1 866, ~ 0 7331 r =
o 2G7 1-w ~o 367 Jnd tq ~o 133
\\'e Qee that our nonmarkovtan queuetng Slstem has some ad\antages
o'er an elementarl queueing system as concerns the capacity for servtce
and dtffers but l1ttle from 1t (to the better) as concerns the "a1t1ng ttme
of a cus tamer and the s 1zc of the queue.
t t 46* There 1s a sIngle co:er' er queue1ng system w1 th t" o plaees
1n the queue Custon1ers arri'\ e at 1t 10 a Palm process, "1th an Inter
arrt\ al t1me T, "luch hns a gener1l1zed Erlang dtstribUtion wtth param
eters A1 and A- 2 the '-er\ 1ce time 1s exponential w1th parameter p.
(1) Apply 1ng tho met hod of phases, ''71 t e t be eq ua t1ons for the l1mt t1ug
probab1ltt1es of states p 0 , p 1 p 2 , p 3 C"tpress the following chara~
terlstics of the S) stem 1n terms of these prohabiltt1es P ~r' Q, A, z,
r T" tg (2) Calculate the limiting probahiltties and the efficiency char
actertstu~s for the folio'\ 1ng tnlttal data A1 = 3 1 A. 2 = 6 and p. = 4
and compare them 'v1th the character1sttcs for an elementary system
"1th param~ters l = (1/.Ar1 + 1/A 2 )-l = 2 and J-t = 4, the one con
stdf'red tn Pro bIem 11 45
Solut1on t1) If" e consider (as usual) the states of the system num
bered ID accordance "1th the number of customers 1n the system, so
s1 , s2 s3 then the s )-stem \\ 1ll not be 1\1 arkovtan To make 1 t ~[ar ko
'tan '' e shall dr\ Ide the 1nterarr1val t1me T., rather than the servtce
ttm e 1n to t \VO phases (I and I I) T = T 1 + T 2 , ,vhere the random varl
able~ T 1 and T 2 have an exponent tal d1strtbut1on 'v1th parameters At
and .A 2 respectl\ ely
\\ e shall enumerate the s t '1. tes of the s} stem 1n accord a nee \Vl th the
number of cu~tomers present 1n the S.)stem and the number of phases
bet" een the customers
Ch. 11. The Queueing Theory 407

s01 -there are no customers in the system, the interarrival time is


lin the first phase;
s02 -there are no customers in the system, the interarrival time is
in the second phase;
s11 -there is one customer in the system (being served), the inter-
-arrival time is in the first phase;
s12 -there is one customer in the system A-t
(being served), the interarrival time is in the Sot Saz
second phase; .il2
s21 -there are two customers in the system
{one is being served and the other is waiting
Jt
s/1
k
A-t
s,z
"'
for service), the interarrival time is in the first
.12 tr.p
phase; I!
s22 -there are two customers in the system .?.,
{one is being served and the other is waiting Szt 822
for service), the interarrival time is in the second .il2
phase; ,lL
£'
Jl.
s31 -there are three customers in the system ;.,
Sst SJz
{one is being served and two are in the queue),
the interarrival time is in the first phase;
s32 -there are three customers in the system
22/
(one is being served and two are in the queue), Fig. 11.46
the interarrival time is in the second phase.
The directed graph of states of the system is shown in Fig. 11.46.
The equations for the limiting probabilities are
'),1Po1 = ~tpn, A2Po2 = 'AtPot + flP121 ('At + f-t) Pu = 'A2Poz + f.lP21•
(/.,2 + ~t) P12 =+ flP22• ('At + f-t) P'.!.t = A2P12 + ~LPa1,
~"tPu
0•2 + ~t) P22 = + flPa2• ('Al + f.l) P:n = AzPzz + AzPaz,
'A1P21
('Az + f.l) Pa2 = 'A1Pa1·
The normalizing condition is Po + Po2 + Pu + p + p + p +
1 12 21 22
Pat + Pa2 = 1.
It is most convenient to solve these equations by expressing the
probabilities Pii in terms of the last probability, p 32 • The expressions
for the probabilities p 1 i (i = 0, 1, 2, 3; j = 1, 2) have the form
Pa~ = { 1 + ~L+'·2 I 1-l C~-t+'·~ +'·2) + ~t (~t +tot!-l+21.2~L+I.n
2

''1 ,. '·1''2 '·1.1•2


-
I

I
~t 2 (~L 2 + 2At~t+ 2A21-l+ '·i-t-1., 1~2
j "') "
+ '-~)
"'i ~t2
408 Applt~d Probltm$ ln Probability Theory

Returning to the ln1t1al (nonmarkov1an) queueing SJ.Stem ~e ob-


tain Po = Pot + Pot' P1 = P11 + P12 P2 = P~1 + P22 and P3 =
P31 + P32 -z =
Furthermore Pre-r = p 3 Q = 1 - p 3 A - Q"A 1p1 + 2p2 +
3p3 r- tp, + 2p3 tw
= zl~ and tq
= r/'J..
The ltmtting probabllltleS of states are Pc 1 ~ 0 308 p 02 ~ 0 208,
p 11 ~ 0 231 p 12 ~ 0 082 p 11 ~ 0 091 p 22 ~ 0 032 p 31 ~ 0 037 and
P:n ~0011
For the tn1t1al (nonmarkovtan) ~ystem p 0 ~ 0 616 p 1 ~ 0 313,
P2 ~ 0 123 P3 ~ 0 048 Pret ~ 0 048 Q ::::0 952 A ~ 1 904 z ~
-
r o o
o 703 ~ 219 1w ~ 352 and Tq ~ 110 o
Comparing these data with the results of the preceding Problem 11.45, we infer that the system considered in Problem 11.46 has only insignificant advantages over that considered in Problem 11.45, i.e. over an elementary queueing system with the same λ and μ, as concerns all the efficiency characteristics.
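As a check on this arithmetic, the balance equations can also be solved numerically. The sketch below (Python with NumPy assumed) builds the transition-rate matrix over the eight phase states and recovers the limiting probabilities and efficiency characteristics for λ1 = 3, λ2 = 6, μ = 4.

```python
import numpy as np

lam1, lam2, mu = 3.0, 6.0, 4.0          # data of Problem 11.46
states = [(k, j) for k in range(4) for j in (1, 2)]   # (customers, phase)
idx = {s: i for i, s in enumerate(states)}

Q = np.zeros((8, 8))                    # transition-rate matrix
for (k, j), i in idx.items():
    if j == 1:                          # phase 1 of the interarrival time ends
        Q[i, idx[(k, 2)]] += lam1
    else:                               # phase 2 ends: an arrival (refused if k = 3)
        Q[i, idx[(min(k + 1, 3), 1)]] += lam2
    if k >= 1:                          # service completion
        Q[i, idx[(k - 1, j)]] += mu
np.fill_diagonal(Q, -Q.sum(axis=1))

# limiting probabilities: p Q = 0 together with the normalizing condition
A = np.vstack([Q.T, np.ones(8)])
p = np.linalg.lstsq(A, np.r_[np.zeros(8), 1.0], rcond=None)[0]

pk = [p[idx[(k, 1)]] + p[idx[(k, 2)]] for k in range(4)]   # p0 ... p3
lam = 1.0 / (1.0 / lam1 + 1.0 / lam2)                      # mean arrival rate, = 2
Q_rel = 1 - pk[3]
print(dict(zip(["p0", "p1", "p2", "p3"], np.round(pk, 3))))
print("P_ref ≈", round(pk[3], 3), " Q ≈", round(Q_rel, 3), " A ≈", round(lam * Q_rel, 3))
```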
11.47. An elementary system without a queue and with unlimited assistance between the servers. Customers arrive at an n-server congestion system in a stationary Poisson process with intensity λ. The servers assist one another, i.e. if some servers are idle when a customer is being served, they all assist the busy server. The intensity of the service process is a function μ = φ(k) of the number k of servers who simultaneously serve the customer. Construct the directed graph of states of the system and find its limiting probabilities of states. Express the following efficiency characteristics of the system in terms of these probabilities: the probability of refusal P_ref, the relative capacity for service Q and the average number of busy servers k̄. Calculate these characteristics for n = 4, λ = 1, φ(k) = kμ, μ = 0.5, and compare them with the same characteristics in the case where the servers do not assist one another.
Solution. When the first customer arrives, all n servers begin serving him. This means that all the servers always function as a single server. The n-server system turns into a single-server system with refusals. Its states are s0, all the servers are free, and sn, all n servers are busy. The
marked directed graph of states is shown in Fig. 11.47. Using this graph, we get the limiting probabilities of states:
p0 = {1 + λ/φ(n)}^(-1) = φ(n)/(φ(n) + λ),   p1 = [λ/φ(n)]·p0 = λ/(φ(n) + λ).
For φ(n) = nμ we have p0 = nμ/(nμ + λ), p1 = λ/(nμ + λ).
For n = 4, λ = 1, μ = 0.5 we have p0 = 2/3, p1 = 1/3, P_ref = p1 = 1/3, Q = 1 - p1 = 2/3 and A = λQ = 2/3 ≈ 0.667.
The average number of busy servers k̄ = 4·(1/3) + 0·(2/3) = 4/3 ≈ 1.333. We get the same result by dividing A by μ: k̄ = (2/3)/0.5 = 4/3.
For the sake of comparison we calculate the same efficiency characteristics for a four-server system without mutual assistance between the servers [see Erlang's formulas (11.0.6) and formula (11.0.7) which follows from them] for ρ = λ/μ = 2:
p0 = {1 + ρ + ρ²/2 + ρ³/6 + ρ⁴/24}^(-1) = 1/7,
P_ref = p4 = (2/3)·(1/7) = 2/21,   Q = 1 - p4 ≈ 0.905,   A = λQ ≈ 0.905,   k̄ = A/μ ≈ 1.81.
Comparing these characteristics with those obtained earlier for a system with assistance between the servers, we infer that in our conditions the assistance is not profitable. This is because for a system with refusals an unlimited assistance (when all the servers "fall upon" one customer while the customers who keep arriving are refused) is never profitable.
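The two variants can be compared with a few lines of code; the following sketch (plain Python, data of Problem 11.47) evaluates the single-server formulas for unlimited assistance and the Erlang loss formulas for independent servers.

```python
from math import factorial

lam, mu, n = 1.0, 0.5, 4                # data of Problem 11.47

# unlimited assistance: all n servers act as one server of rate phi(n) = n*mu
p1 = lam / (n * mu + lam)               # probability that the combined server is busy
print("assistance:    P_ref =", round(p1, 3), " Q =", round(1 - p1, 3),
      " k_bar =", round(n * p1, 3))     # all n servers are busy with probability p1

# no assistance: Erlang formulas for an n-server loss system, rho = lam/mu
rho = lam / mu
p0 = 1 / sum(rho**k / factorial(k) for k in range(n + 1))
p_ref = rho**n / factorial(n) * p0
A = lam * (1 - p_ref)
print("no assistance: P_ref =", round(p_ref, 3), " Q =", round(1 - p_ref, 3),
      " k_bar =", round(A / mu, 3))
```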
11.48. An elementary system without a queue and with a uniform mutual assistance between the servers. Customers arrive in a process with intensity λ at an elementary n-server system with refusals. The servers assist one another, but not by combining their efforts to form a single-server system, as was in the preceding problem, but by the uniform assistance scheme. Thus if a customer arrives when all n servers are idle, all of them begin serving him; but if another customer arrives when the first customer is still being served, some of the servers begin serving the new arrival. If another customer arrives whilst the first two are being served, some of the servers start serving this new arrival, and so on till all n servers become busy. If they are all busy, the next customer to arrive is refused. The function φ(k) = kμ, i.e. k servers serve k times quicker than one server.
Construct a marked directed graph of states of the system and find the limiting probabilities of states and the efficiency characteristics Q, A, k̄ for n = 4, λ = 1, μ = 0.5, i.e. on the hypothesis of Problem 11.47. Compare them with the results for the absence of assistance*).
*) In this problem it is irrelevant how many servers begin serving a new arrival. It is only relevant that all n servers function all the time and that not a single arriving customer is refused until n customers have arrived in the system and all n servers are serving them individually.
Solution. The states of the system are enumerated in accordance with the number of customers present in the system. The directed graph of states is shown in Fig. 11.48. The graph is the same as that for an elementary one-server system with capacity μ* = nμ and a bounded queue which has n - 1 places. To find its characteristics we use formulas (11.0.16)-(11.0.19). Setting ρ = ρ* = λ/μ* = λ/(nμ) = 0.5, we have for m = 3
p0 = (1 - ρ*)/(1 - (ρ*)^5) = 0.5/(1 - 0.5^5) ≈ 0.516,   p4 = (ρ*)^4·p0 ≈ 0.032,
A = λ(1 - p4) ≈ 0.968,   Q = 1 - p4 ≈ 0.968,   k̄ = A/μ ≈ 1.936.
Under the same conditions (see Problem 11.47) with no assistance we have A ≈ 0.905, Q ≈ 0.905, k̄ ≈ 1.81, i.e. a "uniform" assistance somewhat increases the capacity of the system for service. In this case the increase is insignificant since the load on the system is not very large.
11.49. We are given an elementary three-server system with refusals and parameters λ = 4 cust/min, the average service time of a customer served by one server 1/μ = 0.5 min, and the intensity of serving a customer by k servers φ(k) = kμ. Find the efficiency characteristics Q, A, k̄ for three variants: (1) no assistance, (2) unlimited assistance and (3) uniform assistance.
Answer. (1) Q ≈ 0.79, A ≈ 3.16, k̄ ≈ 1.58; (2) Q = 0.6, A = 2.4, k̄ = 1.2; (3) Q ≈ 0.88, A ≈ 3.52, k̄ ≈ 1.76.
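The three answers can be reproduced by evaluating the corresponding formulas of Problems 11.47 and 11.48; the sketch below does this in Python, and its output agrees with the answer up to rounding.

```python
from math import factorial

lam, mu, n = 4.0, 2.0, 3                 # data of Problem 11.49 (1/mu = 0.5 min)

# (1) no assistance: n-server Erlang loss system
rho = lam / mu
p0 = 1 / sum(rho**k / factorial(k) for k in range(n + 1))
Q1 = 1 - rho**n / factorial(n) * p0

# (2) unlimited assistance: one combined server of rate n*mu, no queue
Q2 = 1 - lam / (n * mu + lam)

# (3) uniform assistance: single server of rate n*mu with n - 1 waiting places
chi = lam / (n * mu)
Q3 = 1 - chi**n * (1 - chi) / (1 - chi**(n + 1))

for name, Q in (("no assistance", Q1), ("unlimited", Q2), ("uniform", Q3)):
    print(f"{name:13s} Q = {Q:.3f}  A = {lam * Q:.2f}  k_bar = {lam * Q / mu:.2f}")
```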
11.50. An elementary system with an unbounded queue and assistance between the servers. Customers arrive at an elementary n-server queueing system in a process with intensity λ, the service time of a customer by one server being exponential with parameter μ. The intensity of the service process of a customer by k servers is proportional to k, i.e. φ(k) = kμ. The servers are arbitrarily distributed over the customers in the system, but if at least one customer is present in the system, then all n servers are busy serving.
Numbering the states of the system in accordance with the number of customers in it, construct a marked directed graph of states, find the limiting probabilities of states and calculate the efficiency characteristics z̄, r̄, t̄_w, t̄_q.
Solution. The directed graph of states of this system coincides with that of an elementary single-server system with an unbounded queue, with the intensity of the arrival process λ and of the service process nμ (see 11.0, item 2).
Setting ρ = χ = λ/(nμ) in formulas (11.0.12)-(11.0.15), we get p0 = 1 - χ, pk = χ^k·(1 - χ) (k = 1, 2, ...); z̄ = χ/(1 - χ), r̄ = χ²/(1 - χ), t̄_w = χ/[λ(1 - χ)] and t̄_q = χ²/[λ(1 - χ)].
In this case the efficiency characteristics of the system do not depend
.on whether all the servers serve one customer or whether their services
.are distributed uniformly since no customers are refused (note that in
this case the mean values of the random variables Z, R, Tw, T q do
not vary but their distributions change).
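For concrete numbers these formulas reduce to a few lines of code. The values used below are purely illustrative (they are not taken from the book); only the condition χ = λ/(nμ) < 1 matters.

```python
def assisted_unbounded(lam, mu, n):
    """Characteristics of Problem 11.50: n servers with full mutual assistance
    and an unbounded queue behave like one server of rate n*mu."""
    chi = lam / (n * mu)
    assert chi < 1, "no stationary mode"
    z = chi / (1 - chi)                  # mean number of customers in the system
    r = chi**2 / (1 - chi)               # mean queue length
    return {"z": z, "r": r, "t_w": z / lam, "t_q": r / lam}

# illustrative values: lam = 3 cust/h, n = 4 servers, mu = 1 cust/h
print(assisted_unbounded(lam=3.0, mu=1.0, n=4))
```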
11.51. An elementary system with a bounded queue and uniform mutual assistance. Customers arrive at an elementary queueing system with n servers and uniform assistance in a stationary Poisson process with intensity λ. The service process rendered by one server is stationary Poisson's with intensity μ, while the total service process rendered by k servers has intensity φ(k) = kμ. The servers are distributed over the customers uniformly in the sense that every new arrival gets service if there is a free server. A customer who arrives when all n servers are busy joins the queue. There are m places in the queue; if they are all occupied, the customer is refused.
Numbering the states of the system in accordance with the number of customers in it, construct a marked graph of states of the system and find the limiting probabilities of states. Find the efficiency characteristics of the system P_ref, Q, A, z̄, r̄, t̄_w, t̄_q.
Solution. The states of the system are:
s0 - the system is idle;
s1 - one customer is being served by all n servers; ...;
sk - k customers are being served by all n servers (1 < k < n); ...;
sn - n customers are being served by n servers, there is no queue; ...;
sn+1 - n customers are being served by n servers and one customer is waiting to be served; ...;
sn+m - n customers are being served by n servers, m customers are in the queue.
The directed graph of states of the system is shown in Fig. 11.51. This graph coincides with that of an elementary single-server system with m places in the queue, the intensity of the arrival process λ and the intensity of the service process nμ (see 11.0, item 3). Substituting χ = λ/(nμ) for ρ and n + m for m in formulas (11.0.16)-(11.0.20),
we obtain
p0 = (1 - χ)/(1 - χ^(n+m+1)),   pk = χ^k·p0 (k = 1, ..., n + m),
P_ref = p_(n+m),   Q = 1 - p_(n+m),   A = λQ,
r̄ = χ²[1 - χ^(n+m)·(n + m + 1 - (n + m)χ)] / [(1 - χ^(n+m+1))(1 - χ)],
z̄ = r̄ + k̄,   t̄_w = z̄/λ,   t̄_q = r̄/λ.
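Because the limiting distribution is geometric, the characteristics are also easy to evaluate directly from the probabilities pk = χ^k·p0 rather than from the closed-form expression for r̄; the sketch below does so for illustrative values of n, m, λ and μ (not taken from the book).

```python
lam, mu, n, m = 2.0, 1.0, 3, 4          # illustrative values only
chi = lam / (n * mu)

N = n + m                                # total capacity: n in service + m waiting
p0 = (1 - chi) / (1 - chi**(N + 1))
p = [chi**k * p0 for k in range(N + 1)]  # limiting probabilities p0 ... p_{n+m}

P_ref = p[N]
Q = 1 - P_ref
A = lam * Q
z = sum(k * pk for k, pk in enumerate(p))                  # mean number in the system
r = sum((k - n) * pk for k, pk in enumerate(p) if k > n)   # mean queue length
print(f"P_ref={P_ref:.4f}  Q={Q:.4f}  A={A:.3f}  z={z:.3f}  r={r:.3f}  "
      f"t_w={z/lam:.3f}  t_q={r/lam:.3f}")
```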


11.52. The input of the automated data bank receives λ = 335 records/h on the average. The first operation when primary documents arrive is to select the records which must be fed into the data bank. Six sorters select the records, the average productivity of each being μ = 60 records/h. It is known that 61.3 per cent of the primary documents are selected, on the average, from the arrivals. All the processes are stationary Poisson's. Considering the selection system for the primary documents to be a six-server system (n = 6) with an unbounded queue, find the efficiency characteristics A, Q, k̄, z̄, r̄, t̄_w, t̄_q.

Solution. Since the queue is unbounded, Q = 1 and A = λ. The intensity of the arrival process of primary documents to the data bank λ0 = λp, where p ≈ 0.613, i.e. λ0 = 335 × 0.613 ≈ 205 records/h. The intensity of the arrival process of those documents which will not be fed into the data bank λ_not = λ(1 - p) ≈ 130 records/h.
The average number of sorters busy with selecting the documents k̄ = λ/μ = ρ = 5.58 and does not depend on the number of servers (sorters). There is a stationary situation in the system if the condition χ = λ/(nμ) < 1 is fulfilled; in our case it is fulfilled (χ ≈ 0.93).
The probability that all n sorters in the system will be busy [see formula (11.0.22)] is
p_n = P(n, ρ)/[R(n, ρ) + P(n, ρ)·χ/(1 - χ)].
Using the tables in Appendices 1 and 2, we obtain
p6 = 0.1584/(0.6703 + 0.1584·χ/(1 - χ)) ≈ 0.0569.
The average number of primary documents in the queue [see (11.0.23)] r̄ = p6·χ/(1 - χ)² ≈ 10.8. The average queueing time of a document t̄_q = r̄/λ ≈ 1.87 min. The average number of documents in the system (in the queue and being processed) z̄ = r̄ + ρ ≈ 16.38. The average waiting time of a document t̄_w = t̄_q + 1/μ ≈ 2.87 min.
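The solution can be reproduced (up to the rounding of the table values used above) from the standard n-server formulas; a short sketch in Python:

```python
from math import factorial

lam, mu, n = 335.0, 60.0, 6             # records/h, data of Problem 11.52
rho, chi = lam / mu, lam / (n * mu)     # rho = 5.58, chi ≈ 0.93 < 1

p0 = 1 / (sum(rho**k / factorial(k) for k in range(n))
          + rho**n / (factorial(n) * (1 - chi)))
p_n = rho**n / factorial(n) * p0        # all sorters busy, no queue
r = p_n * chi / (1 - chi)**2            # mean number of documents in the queue
z = r + rho                             # queued plus being processed
print(f"k_bar = {rho:.2f}  p_n = {p_n:.4f}  r = {r:.1f}  z = {z:.1f}  "
      f"t_q = {r / lam * 60:.2f} min  t_w = {(r / lam + 1 / mu) * 60:.2f} min")
```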
11.53. Demands arrive at the input of a queueing system (Fig. 11.53) in a stationary Poisson process with intensity λ. The service consists of two successive phases performed in system 1 and system 2. A demand is served in system 1 and the quality of the service is checked in system 2. If no faults in the service are detected by system 2, then the demand is considered to have been served by the queueing system; if system 2 detects faults in the service of a demand, the demand is sent
back to system 1 for reservice (see Fig. 11.53). The probability that
a demand served in system 1 will be returned to system 1 for reservice
is 1 - p and does not depend on how many times it has been served
in system 1.
Systems 1 and 2 are n1-channel and n2-channel systems with unbounded queues and service intensities in each channel of μ1 and μ2 respectively. The time needed for reservicing a demand in a channel of system 1 and for retesting the quality of service by a channel of system 2

llaeaemg sysrem
,------------------------~
I
2 :~+ J~ System 1 Jot liz
~
Sgst.em 2
nz,Pz
loz ~'p' 1,10~

I flttJlt \. / I
l Demands sent to system .1 for reservice I
I l
I A. 02 (1-p)
l
~---------------------~
Fig. 11.53

have exponential distributions (the same as when these operations were


carried out for the first time) with parameters ~t 1 and 11 2 respectively.
Find the conditions for a stationary situation in this queueing system,
assuming that the demands arrive at system 1 and system 2 in stationary
Poisson processes.
Solution. We denote the arrival intensity at the input of system 1 by λ1. It is evident that λ1 > λ since the input of system 1 receives demands arriving at system 1 for the first time (the intensity of the process is λ) plus demands returned for reservicing (see Fig. 11.53). If a stationary situation exists, then the arrival intensity of demands served in system 1, λ01, is equal to the intensity λ1. The demands served by system 1 arrive in a flow at system 2 and, consequently, the input of system 2 receives demands arriving in a flow with intensity λ2 = λ1 = λ01. Since there is a stationary situation in system 2, the intensity of the departure process of demands from system 2 (λ02) is also λ2. Thus
λ1 = λ01 = λ2 = λ02.   (11.53.1)
Evidently, the intensity λ0 of the departure process of demands which have been served in the queueing system at steady state is equal to the intensity λ of the arrival process. The intensity λ0 with which the demands are served in the queueing system is equal to that at the output of system 2 (λ02) multiplied by the probability p that a demand will not be returned to system 1 for a repeated service:
λ0 = pλ02 = λ,   (11.53.2)
whence
λ02 = λ/p.   (11.53.3)
Thus [see formulas (11.53.1)-(11.53.3)]
λ1 = λ2 = λ/p.   (11.53.4)
For the queueing system's operation to be steady-state, it is necessary that both system 1 and system 2 should successfully handle all the arriving demands and, consequently, the following two conditions should be fulfilled:
χ1 = λ1/(n1μ1) = λ/(pn1μ1) < 1,   (11.53.5)
χ2 = λ2/(n2μ2) = λ/(pn2μ2) < 1.   (11.53.6)
They follow from the fact that system 1 and system 2 are n1-channel and n2-channel systems with channel service intensities μ1 and μ2 respectively and with an unbounded queue (see item 4 at the beginning of this chapter).
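A sketch of the check of conditions (11.53.5) and (11.53.6); the numerical values are illustrative only and are not taken from the book.

```python
def steady_state_ok(lam, p, n1, mu1, n2, mu2):
    """Conditions (11.53.5)-(11.53.6): both phases must cope with the flow lam/p."""
    chi1 = lam / (p * n1 * mu1)
    chi2 = lam / (p * n2 * mu2)
    return chi1, chi2, chi1 < 1 and chi2 < 1

# illustrative numbers: lam = 10 demands/h, p = 0.8,
# three channels of rate 5/h in system 1, two channels of rate 8/h in system 2
print(steady_state_ok(lam=10.0, p=0.8, n1=3, mu1=5.0, n2=2, mu2=8.0))
```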
11.54. For the conditions of the preceding problem find the average waiting time of a demand and the average number of demands present in the system.
Solution. The average time t̄1 a demand is present in system 1 for the first time (see Fig. 11.53) can be found from the condition that demands arrive at the input of the system in a stationary Poisson process with intensity λ1 = λ/p, the number of channels is n1, the intensity of service is μ1 and the number of places in the queue is unlimited.

(11 54 1)
where
v _ Pt 1 _ ).
AI - nl ' 1\.t - p'
nl'
p,-z +l

nl ntl
+ P "~
1
1
1-}(l
+

---:-- --- J-1
'\
By analogy "e calculate the vartable t~" h1ch 1s the a' erage ttme
a demand IS present tn S) stem 2 for the first time for the condtt1ons
la = 'Alp n 2 and p. 2
( 11 54 2)
where

Poz = [ 1 + fi + ~r + t
1-x
]-1
Consequently, the average time needed for a once-through service of a demand by system 1 and system 2 is
τ̄12 = t̄1 + t̄2.   (11.54.3)
It follows from Problem 11.53 that a random variable X, defined as the number of times a demand is served by system 1 and system 2, has a geometric distribution beginning with unity with parameter p (q = 1 - p):

X | 1 | 2  | 3   | ... | k        | ...
P | p | pq | pq² | ... | pq^(k-1) | ...   (11.54.4)
We denote the time of the first, second, ... , the kth cycle of pro-
cessingademandinsystem1 andsystem2by T1 , T 2 , • • • , Th, ....
By the hypothesis the random variables T1 , . • . , T h, . • • are mutu-
-
ally independent, have the same distribution and the mean value • 12 •
We can write an expression for the time a demand remains in the
queueing system (with due regard for the possibility that a demand is returned for reservice) in the form T = Σ_{k=1}^{X} Tk, i.e. the sum of a random
number of random terms, where the number of terms X does not depend
on the random variables T 1 , T 2 , T 3 , • • • • In accordance with the-
solution of Problem 7.64 we find the mean value of the variable T:
t̄ = τ̄12/p.   (11.54.5)
Little's formula is also valid for the queueing system we have con-
sidered, and, therefore, the average number of demands present in the
system can be found from the formula
z̄ = t̄λ.   (11.54.6)
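The chain (11.54.1)-(11.54.5) is easy to evaluate once the mean sojourn time of an n-channel system with an unbounded queue is available; the sketch below uses the same illustrative numbers as in the preceding sketch (they are not from the book).

```python
from math import factorial

def mean_time_mmn(lam, mu, n):
    """Mean time a demand spends in an n-channel system with an unbounded queue
    (service 1/mu plus the mean wait), as in formulas (11.54.1)-(11.54.2)."""
    rho, chi = lam / mu, lam / (n * mu)
    p0 = 1 / (sum(rho**k / factorial(k) for k in range(n))
              + rho**n / (factorial(n) * (1 - chi)))
    wait = rho**(n + 1) * p0 / (n * factorial(n) * (1 - chi)**2 * lam)
    return 1 / mu + wait

lam, p = 10.0, 0.8                      # illustrative external intensity and probability p
lam1 = lam / p                          # intensity seen by systems 1 and 2, (11.53.4)
tau12 = mean_time_mmn(lam1, 5.0, 3) + mean_time_mmn(lam1, 8.0, 2)   # (11.54.3)
t_total = tau12 / p                     # (11.54.5): geometric number of cycles
print(f"tau12 = {tau12:.3f} h,  t = {t_total:.3f} h,  z = {lam * t_total:.2f} demands")
```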
11.55. The conditions of Problem 11.53 are changed so that in both system 1 and system 2 a demand is served for the first time and the quality of service is checked. If a demand does not pass the check in system 2, it is sent to system 3 for a reservice and to system 4 for rechecking (Fig. 11.55). System 3 and system 4 are n3-channel and n4-channel systems with exponential distributions of service time of demands in the channels and with parameters μ3 and μ4 respectively. The probability that a demand processed in system 3 will be returned to system 3 after the check in system 4 is 1 - π. Find the conditions for steady-state operation of this queueing system, assuming that the demands arrive at system 1, system 2, system 3 and system 4 in processes which are stationary Poisson's.
Solution. When there is a stationary situation in the queueing system, the following relationship is obvious (see Fig. 11.55):
λ = λ1 = λ01 = λ2 = λ02.   (11.55.1)
Consequently, the intensity of the departure process of demands which have been served for the first time and checked in systems 1 and 2 is λp.
The intensity of the process in which the demands that have not passed the check in system 2 arrive for reservicing at system 3 and system 4 is λ(1 - p). The operation of the reservice system (system 3 and system 4) (from point 1 to point 2 in Fig. 11.55) does not differ in principle from the operation of the system considered in Problem 11.53.

I J, System 1 A. at J'2 Sgsttm 2 02 - 1


·-.
..
.; rp\
n Pt nJJiz
l \....,,...'
.tp I
I ~ I
! (I?) J)~Mards u h ch have nJt passed//;~ rhe-clr u, $f.!ltn 2 IJo
l
{ +A{l
I
--
p)
!
I
4 -.
!
l+ . -;
I "'J.J S;s!tn J
n!PJ
4.oJ ,~,

...
~
S.rst~w 4
n, p_.
Ao~-
,. "''
~:r
'-..I 2. r

I
rt
I
Demands sent for rrse,.vtre
i
- - _........._ _________ j

Ftg tf 55

Thus the following conditions must be fulfilled simultaneously for the stationary mode of operation of the system considered in this problem:
χ1 = λ/(n1μ1) < 1,   χ2 = λ/(n2μ2) < 1,   χ3 = λ(1 - p)/(πn3μ3) < 1,   χ4 = λ(1 - p)/(πn4μ4) < 1.   (11.55.2)
In addition, the following equalities will hold true (see Fig. 11.55):
λ3 = λ03 = λ4 = λ04 = λ(1 - p)π^(-1).   (11.55.3)
11.56. For the conditions of the preceding problem find the average waiting time t̄ of a demand and the average number of demands z̄ present in the system.
Solution. The average waiting time of a demand can be found from the condition that demands arrive at the input of the system in a stationary Poisson process with intensity λ1 = λ [see Fig. 11.55 and formula (11.55.1)], the number of channels is n1, the intensity of service in a channel is μ1, and the number of places in the queue is unlimited. In accordance with (11.0.21)-(11.0.25), we have for these conditions
t̄1 = (ρ1 + r̄1)/λ1 = 1/μ1 + ρ1^(n1+1)·p01/[n1·n1!·(1 - χ1)²·λ1],   (11.56.1)
where
ρ1 = λ1/μ1,   χ1 = ρ1/n1,
p01 = [1 + ρ1/1! + ρ1²/2! + ... + ρ1^(n1)/n1! + ρ1^(n1+1)/(n1·n1!) · 1/(1 - χ1)]^(-1).
By analogy we calculate the variable t̄2 for system 2 at λ2 = λ, n2 and μ2:
t̄2 = (ρ2 + r̄2)/λ2 = 1/μ2 + ρ2^(n2+1)·p02/[n2·n2!·(1 - χ2)²·λ2],   (11.56.2)
where
p02 = [1 + ρ2/1! + ρ2²/2! + ... + ρ2^(n2)/n2! + ρ2^(n2+1)/(n2·n2!) · 1/(1 - χ2)]^(-1).

The average waiting time of a demand in system 1 and system 2 is
t̄12 = t̄1 + t̄2.   (11.56.3)
The average time t̄3 a demand is present in system 3 for a once-through service can be found by assuming that demands arrive at the input of this system in a stationary Poisson process with intensity λ3 = λ(1 - p)/π [see Fig. 11.55 and formula (11.55.3)], the number of channels is n3, the intensity of service in a channel is μ3, and the number of places in the queue is unlimited. Then, in accordance with (11.0.21)-(11.0.25), we have
t̄3 = (ρ3 + r̄3)/λ3 = 1/μ3 + ρ3^(n3+1)·p03/[n3·n3!·(1 - χ3)²·λ3],   (11.56.4)
where
ρ3 = λ3/μ3 = λ(1 - p)/(πμ3),   χ3 = ρ3/n3,
p03 = [1 + ρ3/1! + ρ3²/2! + ... + ρ3^(n3)/n3! + ρ3^(n3+1)/(n3·n3!) · 1/(1 - χ3)]^(-1).
By analogy we calculate the variable t̄4 for system 4 at λ4 = λ(1 - p)/π, n4, μ4:
t̄4 = (ρ4 + r̄4)/λ4 = 1/μ4 + ρ4^(n4+1)·p04/[n4·n4!·(1 - χ4)²·λ4].   (11.56.5)
Consequently, the average time per demand processed once in system 3 and system 4 is
τ̄34^(1) = t̄3 + t̄4.   (11.56.6)
In accordance with formula (11.54.5), the expectation of the waiting time of a demand in system 3 and system 4 (see Fig. 11.55), with due regard for possible reprocessing, is
τ̄34 = τ̄34^(1)/π.   (11.56.7)
To find the average waiting time of a demand in the queueing system, we consider two hypotheses: (1) H1, the demand was only processed once, P(H1) = p; and (2) H2, the demand was processed more than once, P(H2) = 1 - p.
On the hypothesis H1 the expectation of the waiting time of a demand in the queueing system can be found from formula (11.56.3); on the hypothesis H2 the expectation can be found from formulas (11.56.7) and (11.56.3). The complete expectation of the waiting time of a demand in the system is
t̄ = p·t̄12 + (1 - p)(t̄12 + τ̄34) = t̄12 + (1 - p)·τ̄34.   (11.56.8)
The average number of demands present in the system can be found from the formula
z̄ = t̄λ.   (11.56.9)
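Formulas (11.56.1)-(11.56.9) can be combined into a short computation; the sketch below uses illustrative parameters (not from the book) that satisfy conditions (11.55.2).

```python
from math import factorial

def mean_time_mmn(lam, mu, n):
    # mean sojourn time (service + wait) in an n-channel system with unbounded queue
    rho, chi = lam / mu, lam / (n * mu)
    p0 = 1 / (sum(rho**k / factorial(k) for k in range(n))
              + rho**n / (factorial(n) * (1 - chi)))
    return 1 / mu + rho**(n + 1) * p0 / (n * factorial(n) * (1 - chi)**2 * lam)

# illustrative data only
lam, p, pi = 10.0, 0.8, 0.9
t12 = mean_time_mmn(lam, 5.0, 3) + mean_time_mmn(lam, 8.0, 2)          # (11.56.3)
lam34 = lam * (1 - p) / pi                                             # (11.55.3)
tau34 = (mean_time_mmn(lam34, 4.0, 1) + mean_time_mmn(lam34, 6.0, 1)) / pi   # (11.56.6)-(11.56.7)
t = t12 + (1 - p) * tau34                                              # (11.56.8)
print(f"t = {t:.3f} h,   z = {lam * t:.2f} demands")                   # (11.56.9)
```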
11.57. Let us consider the formulation of Problem 11.53 as applied to multi-fold processing of information in two phases. To check that an input card has been correctly punched, the punching is performed twice, on a punch unit and on a verifier, which can detect an error. The punch-unit operator makes one error per 1000 symbols on the average. Each punch card contains an average of 80 symbols. When at least one error is detected on a punch card, the card is returned for a repunching. It is necessary to process 50 000 documents in a year, each of which contains an average of 400 symbols.
Find the minimum possible number of punch-unit and verifier operators needed if one punch-unit operator punches 4.2 × 10^6 symbols a year, and find the characteristics of the system.
Solution. It follows from the conditions of the problem that an average of 50 000 × 400/80 = 250 000 cards must be punched and verified per year. Consequently, λ = 250 000 cards/year = 125 cards/h ≈ 2.08 cards/min. The productivity of a punch-unit or verifier operator is μ1 = μ2 = 4.2 × 10^6/80 = 52 500 cards/year = 26.25 cards/h. The probability that no errors will be found on a punch card is p = (1 - 0.001)^80 ≈ 0.9. In this case we neglect the probability that both the punch-unit operator and the verifier operator will err at the same symbol, since it is much smaller than (0.001)² = 10^(-6): the keyboard of the punch unit and the verifier has about 50 symbol keys, and if the same error can be introduced by pressing any one of the
keys, then we find that the probability of the same error is of the order of 10^(-6)/2500. If we assume that an error can only be introduced by pressing a key neighbouring the correct symbol (of which there are from five to eight), then the probability is 10^(-6)/25 to 10^(-6)/64.
For stationary operation of the punch-unit operators and the verifier operators [see (11.53.5), (11.53.6)] we must have
λ/(n1μ1p) = χ1 < 1,   λ/(n2μ2p) = χ2 < 1,
whence n1 > λ/(μ1p) = 250 000/(52 500 × 0.9) = 5.29 and n2 > 5.29. Hence we must have six punch-unit operators and six verifier operators.
The characteristics of the operation for the punch-unit operators: λ1 = λ/p = 250 000/0.9 ≈ 278 000 cards/year, n1 = 6, μ1 = 52 500 cards/year, χ1 = λ1/(n1μ1) = 0.882 and ρ1 = λ1/μ1 = 5.29.
We use formula (11.0.22) to find the probability that all the six punch-unit operators are busy and there is no queue of documents that must be punched:
p_n1 = P(n1, ρ1)/[R(n1, ρ1) + P(n1, ρ1)·χ1/(1 - χ1)] = 0.152/(0.282 + 0.152·0.882/(1 - 0.882)) ≈ 0.107.

1erage numbe r of cards waiting to be punche d 11 = Pn,x1 /(1 -


. 78.
- -
verage queuei ng time of a card tq = r1 p'A -l = 2,92 min and
+ 1/f.t1 = 5.21 min.
- - - -
otal numbe r of cards in the first phase z1 = r 1 + ~ = r 1 +
.07.
the charact eristics of the second phase (verification) are the
those of the first phase, we can write the general charact eristics
)perati on of the system with due regard for the possibi lity that
ay be returne d for repunc hing. The overall average numbe r of
- -
·esent in the system z = 2z1 = 24.14 and the number of these
1 the queue r = 2r1 = 13.56. The overall average process ing
1r card, with due accoun t of the possibl e return of a card for
;sing, T = ~ 2 /p = (t 1 +
~)/p = 2£;/p = 11.57 min.
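The whole chain of the solution can be reproduced numerically; the sketch below follows the steps above, computing P(n, ρ) and R(n, ρ) directly instead of taking them from Appendices 1 and 2, so the printed figures differ from the values above only by rounding. A 2000-hour working year is assumed, which is consistent with λ = 125 cards/h.

```python
from math import exp, factorial

lam = 250_000.0                          # cards per year
mu = 4.2e6 / 80                          # 52 500 cards per year per operator
p, n = 0.9, 6
lam1 = lam / p                           # flow through each phase, repunching included
rho, chi = lam1 / mu, lam1 / (n * mu)

P = rho**n * exp(-rho) / factorial(n)                                   # P(n, rho)
R = 1 - sum(rho**k * exp(-rho) / factorial(k) for k in range(n + 1))    # R(n, rho)
p_n = P / (R + P * chi / (1 - chi))      # formula (11.0.22) as used in the solution
r1 = p_n * chi / (1 - chi)**2            # cards waiting to be punched
year_min = 2000 * 60                     # 2000 working hours a year (assumption, see above)
t_q = r1 / lam1 * year_min
t_w = t_q + year_min / mu
z1 = r1 + rho
print(f"p_n={p_n:.3f}  r1={r1:.2f}  t_q={t_q:.2f} min  t_w={t_w:.2f} min  z1={z1:.2f}")
print(f"whole system:  z={2*z1:.2f}  r={2*r1:.2f}  T={2*t_w/p:.2f} min")
```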
Appendices

APPENDIX 1

Poisson's Distribution P(m, a) = (a^m/m!)·e^(-a)

m     a=0.1   a=0.2   a=0.3   a=0.4   a=0.5   a=0.6   a=0.7   a=0.8   a=0.9
0 ()
tl~ 8181 0 1403 0 6703 o r,oes 0 MSS. 0 4966. o «93lo 400&
i 0 0005 0 1638 0 2222 0 2681 0 3033 0 3293 0 3476 0 3595 0 3659
2 0 Ol)'t5 0 0184 00333 0 0536 0 0758 0 0988.0 t2t7 0 1438 0 t647
3 0 0002 a oot9 0 0033 0 0072 0 Ot26 0 Of9S 0 0284 0 0383 0 CH94
4 0 OCOl 0 0002 0 0007 0 OOj6 0 003f) 0 0050 0 0077 0 OJti
5 0 0001 0 0002 0 0001 0 0007 0 0012 0 0020
6 0 0001 0 0002 0 0003
m     a=1     a=2     a=3     a=4     a=5     a=6     a=7     a=8     a=9     a=10

0 o 3&79 o t353 o ni9S o ot83 o ooo1 o oozsjo 0009 o 0003 o ooot o 0000
t 0 3679 0 2707 0 14M 0 0733 0 0337 0 0149 0 0064 0 0027 0 OOif 0 00'15
2 0 f839 0 2707 0 2240 0 f 465 0 0~2 0 01-48 D 0223 0 0 J07 0 0050 0 0023
3 0 0513 0 1804 0 2240 0 t954 0 1W\ 0 0892 0 0521 0 0285 0 0150 0 0076
4 0 Ot53 0 0902 0 1680 0 i 9M 0 t 755 0 t339 0 0912 0 0572 0 0337 0 0189
0 00310 0361 0 iOOS 0 1563 0 1755 0 1606 0 t277 0 0916 0 0507 0 0378 I
5
6 o ()(X)5 o ot20 o oso4 o 1042 a i462 o f606 a t400 o f22-t o oot-t :a OB3t
7 0 OOOJ 0 0037 0 02j6 0 0595 0 1044 0 t377 0 t490 0 j39S 0 t17i 0 0901.
8 0 0009 0 OOS i 0 0208 0 0653 0 t033 0 1304 0 13!16 0 13 i 8 0 t i.26
9 0 0002 0 0027 0 0132 0 0363 0 0689 0 i014 0 124.\ 0 1318 0 1251
ta o ooog o ooss o oist a 01i3 a orta a WJ3 u -ii86 a tMf
tJ 0 0002 0 0019 0 0002 0 0225 0 0~52 0 0722 0 0970 0 t 137
12 0 0001 0 l)J060 0034 0 0126 0 026 30 048t 0 07280 0948
13 0 t)(K)2 0 0013 0 0052 0 0142 0 0296 0 0504. 0 0729
t4 0 0001 0 000 50 0022 0 0071 0 0169 0 03240 0521
f5 fa 0002 (j 00090 0033 0 0000 (J 0194 0 0047
tG 0 0003 0 0014 0 0045 0 01090 0211
t7 0 000 10 0006 0 0021 0 00580 Oi28
18 0 0002 0 0009 0 0029 0 0071
i9 Kl OOOf 0 0004 0 00!4 0 2237 l
1D \) t'C02 0 ~ Q. 00l9
21 0 0001 0 0003 0 0009
22 0 (0)1 0 0004
23 0 tm2
0 OOOi

APPENDIX 2

Probabilities*) R(m, a) = 1 - Σ_{k=0}^{m} (a^k/k!)·e^(-a)

m     a=0.1     a=0.2     a=0.3     a=0.4     a=0.5

3.2968-1 8.9347-1
0 9.5163-2 1.8127-1 2.5918-1
9.0204-2
1.7523-2 3.6936-2 6.1552-2
1 4.6788-S 7.9263-3 1.4388
1.1485-3 3.5995-S
2 1.5465-~
2.6581-4 7. 7625-4 1. 7516-3
3 3.8468-6 5.6840-6
1.5785-r, 6 .1243-r, 1.7212-4
4 2.2592- 6 1.4165-r,
4.0427- 6
5 1.0024-6
6

m     a=0.6     a=0.7     a=0.8     a=0.9

5.5067-1 5.9343-1
0 4.5119-1 5.0341-1 2.2752
1.5580 1.9121
1 1.2190
4.7423-2 6.2857-2
2 2.3115-2 3.4142-2 1.3459
3 3.3581-3 5. 7535-3 9.0799-3
1.4113 2.3441-3
4 3.9449-4 7.8554- 4 3.4349-4
3.8856-o 9.0026-6 1.8434-4
5 4.3401-r,
6 3.2931-6 8.8836- 6 2.0747-6
2.0502-6 4.8172-G
7
m     a=1     a=2     a=3     a=4     a=5

9.8168-1 9.9326-1
0 6.3212-1 8.6466-1 9.5021-1 9.5957
8.0085 9.0842
1 2.6424 5.9399 7.6190 8.7535
2 8.0301-2 3.2332 5.7681 7.3497
3.5277 5.6653
3 1.8988 1.4288 5.5951
1.8474 3. 7116
4 3.6598-3 5.2653-2 2.1487 3.8404
5 5.9418-4 1.6564 8.3918- 2 2.3782
1.1067
3.3509 1.3337
6 8.3241-o 4.5338- 3 5.1134- 2
7 1.0219 1.0967 1.1905
2.1363 6.8094-2
8 1.1252-6 2.3745-4 3.8030-3

*) P(m, a) = (a^m/m!)·e^(-a) can be found in terms of R(m, a) as follows: P(m, a) = R(m - 1, a) - R(m, a) (m > 0), P(0, a) = 1 - R(0, a).

Continued

m     a=1     a=2     a=3     a=4     a=5

9 li ~98-1 1. i025 8 t32z-s 3 1828


10 8 3082-6 2 923.'1-~ 2 8399 1 3695
11 1 3646 7 t387-f. 0 1523-' 5 4531•3
i2 i 6149 2 7372 2 0189
t3 3 4019-'$ 7 6328- 1 6 9799~

1'15 t 9932
4 8926- 4
2 2625
6 OOJS- 6
16 i 1328 t 9S69
17 5 4163-'
t8 1 4017
m     a=6     a=7     a=8     a=9     a=10

0 9 9752-l 9 990~-l 9 995(}-l 9 99S9- 1 9 !)995-1


l 9 82G5 9 9270 9 9G9g 9 9377 9 9950
2 9 3800 9 7036 9 8825 9 9317 9 9123
3 84880 9 t823 9 5762 9 7877 9 8966
4 1 1.494 8 ~1~\ 9 0~31 9 ~~~ 9 70.15
5 S M32 6 9929 8 0376 8 8\31 9 3291
6 3 937() 5 5029 68663 1 9322 8 6938
1 2 5002 4 0129 5 470i 6 7610 7 7978
8 t 5276 2 70[)1 4 0745 5 4~35 6 6718
9 8 3924-~ 1 69~0-l 2 8333-l 4 t259 l 5 4207•1
10 4 26:!1 9 8521- 2 1 8411 2 9t01 4 109~
ff 2 0092 5 3350 1 1192 t 9o99 3 0322
12 8 8275-:S 27000 6 3797-1 i 2\23 2 054-1
~3 3 6285 f 281 I 3 4181 7 33'51-2 f 355~
14 1 4004 5 7172-1 1 7257 4 1~56 8 3-LSS I
iS 5 09iQ- 4 2 4036 8 231Q-3 2 2035 4 874.0
16 1 7488 9 5St8-,l 3 7180 t ttOB 2 70-i2
t7 5 8917-1 3 6178 i 5943 s 3J!Jli_, t 4278
18 1 7597 t 2935 6 5037..., 2 4251: 7 1865-1
m     a=6     a=7     a=8     a=9     a=10

19 I 5 1802-1
I

4 4402-1 2 5291 ' 1 0360 3 45l3


20 t 4551 1 4495 9 3968-6 4 3925-l 1 5'38.3
21 4 5263-e a 3407 1 7495 6 996.5 I
22 1. 3543 1 f385 6 agzg-s 2 9574
23 3 7255... 8 2 4519 1 2012
24 2 J722 8 6531-& 4 £919-a
25 2 9414 1 7690
26 6 4229-8
27 2 2535
Example. We must find the probability that an event A will occur no more than twice if a = 7.
We have 1 - R(2, 7) = 1 - 0.97036 = 0.02964.
Remark 1. If a number in the table has no exponent, then the exponent of the previous number in the column applies. For example, R(33, 19) = 1.2067 × 10^(-3).
2. For a > 20 the probability R(m, a) can be approximated by
R(m, a) ≈ Φ((a - m - 0.5)/√a) + 0.5,
where Φ(x) is the error function (Appendix 5).
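The approximation of Remark 2 is easy to check against an exact value; the sketch below (plain Python) compares the exact tail R(m, a) with the normal approximation for a = 20, m = 25, where the table gives 1.1218 × 10^(-1).

```python
from math import erf, exp, factorial, sqrt

def R(m, a):
    """R(m, a) = P(X > m) for a Poisson variable X with parameter a (Appendix 2)."""
    return 1 - sum(a**k * exp(-a) / factorial(k) for k in range(m + 1))

def R_normal(m, a):
    """Normal approximation of Remark 2; Phi is the error integral of Appendix 5."""
    Phi = lambda x: 0.5 * erf(x / sqrt(2.0))
    return Phi((a - m - 0.5) / sqrt(a)) + 0.5

print(R(25, 20), R_normal(25, 20))   # both close to the table value 0.11218
```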

m     a=11     a=12     a=13     a=14     a=15

0 9.9998-1 9.9999-1
9.9997-1 9. 9999-1
1 9.9980 9.9992 9.9996-1
9.9978 9.9991
2 9.9879 9.9948 9.9979
9.9895 9.9953
3 9.9508 9.9771
9.9819 9.9914
4 9.8490 9.9240 9.9626 9.9721
9.8927 9.9447
5 9.6248 9.7966 9.9237
9. 7411 9.8577
6 9.2139 9.5418 9.6838 9,8200
7 8.5681 9.1050 9.4597 9.6255
9.0024 9.3794
8 7.6801 8.4497
8.9060 9.3015
9 6.5949 7.5761 8.3419 8.8154
7.4832 8.2432
10 5.4011 6.5277 7.3996 8.1525
11 4.2073 5.3840 6.4684 7.3239
5.3690 6.4154
12 3.1130 4.2403 5.3555 6.3678
13 2.1871 3.1846 4.2696
4.2956 5.3435
14 2.2798 3.2487 4.3191
1.4596 3.3064
15 9.2604-2 1.5558 2.3639 3.3588
1.6451 2.4408
16 5.5924 1.0129 1.7280 2.5114
17 3.2191 6.2966- 2 1.0954 1.8053
6.9833-2 1.1736
18 1.7687 3.7416
7.6505-2 1.2478
19 9.2895-3 2.1280 4.2669 8.2972-3
2.5012 4.7908
20 4.6711 1.1598 2.8844 5.3106
21 6.0651-3 1.4081 3.2744
2.2519 1.6712
22 3.0474 7.6225-3 1.9465
1.0423 ll.3276- 3
23 4.6386-4 1.4729 3.9718
5.0199 1.1l65
24 6.8563- 4 1.9943 6.1849-3
1.9371 2.6076
25 3.0776 9.6603-4 3.3119
8.2050-5 1.3087
26 1.3335 4.5190 1. 7158
3.2693
2.0435 6.3513-4
27 5.5836-5 8.6072-~
1.2584 8.9416- 5 2.9837
28 4.6847-6 2.2616
m     a=11     a=12     a=13     a=14     a=15

29 t 6882 8 870t-l 3 7894 t 3580 4.1843


so 3 3716 t 5568 5 9928-1 1 9731
3t t 2.4.32 6 2052-" 2 5665 9 0312-1
32 2 4017 1 0675 4 0155
33 4 3154-l i 7356
34 1 6968 7 2978-4
35 z 987f
1 1910
36
I
m     a=16     a=17     a=18     a=19     a=20

0
t
2 g {iggS-l g ggog-t
3 9 9!191 9 $)996 9 9098-1 9 9999-1
4 9 9900 9 9982 9 9992 9 9996 9 9998-l
5 9 9862 9 9933 9 9V68 9 9985 9 9993
6 9 9599 9 9794 9 9896 9 9948 g 9974
I
7 9 9{)0Q 9 9457 9 9711 9 9849 9 9922
8 9 7801 9 8740 9 9294 9 9613 9 9791
9 0 5S70 S' 7388 9 8482 9 91f4 9 9500
10 9 2260 9 5088 9 6963 9 81G8 9 8919
11 8 7~01 9 1533 9 45lt 9 6533 9 7861
12 8 0688 8 6498 9 0833 9 3944 9 0099
13 7 2S45 7 W13 s 5140 g 0150 g ~357
14 6 3247 7 1917 7 9192 8 5025 8 951~
15 5 3326 6 2855 7 1335 7 8521 8 4.349
16 4 3404 5 3226 6 2495 7 0797 7 7893
17 3 40CG 4 3598 5 3135 6 2164 7 0297
18 2 5765 3 4504 4 3776 5 3052 6 i858
19 1.8775 2 6368 3 4908 4 3939 5 2974
20 1 3183 1 9452 2 6928 3 5283 4 409t
21 8 9227-2 1 3853 2 0088 2 7450 3 5630
22 5 8241. 9 5272.-1 t 4491 2 0687 2 7939
23 3 6686 6 3296 1 0111 t 5098 2 t251
24 2. 2315 4 {)646 s 82ro-' 1 0075 1 5671
2S 1 31t9 2 52 ..5 4 4608 7 3126-~ 1 i218
26 7 ~589-3 1 5114 2 8234 4 8557 7 7887-t
27 4 1051 8 8335-3 1 7318 3 i268 5 2481
28 2 18E6 4 9838 t 0300 1 9536 3 4334

Continued

m a= 16 a= 17 a= 18 a= 19 a=20

29 1.1312 2.7272 5.9443-3 1.1850 2.1818


30 5.6726-4 1.4484 3.3308 6.9819-3 1.3475
31 2.7620 7.4708-4 1.8133 3.9982 8.0918-3
32 1.3067 3.7453 9.5975-4 2.2267 4.7274
33 6.0108-5 1.8260 4.9416 1.2067 2.6884
34 2.6903 8.6644- 6 2.4767 6.3674-4 1.4890
35 1.1724 4.0035 1.2090 3.2732 8.0366-4
36 4.9772-6 1.8025 5.7519-6 1.6401 4.2290
37 2.0599 7.9123-6 2.6684 8.0154- 5 2.1708
38 3.3882 1.2078 3.8224 1.0875
39 1.4162 5.3365- 6 1.7797 5.3202-6
40 2.3030 8.0940-6 2.5426
41 3.5975 1.1877
42 1.5634 5.4252- 6
43 2.4243
44 1.0603
APPENDIX 3

The Values of the Function e^(-x)

x     e^(-x)   Δ     x     e^(-x)   Δ     x     e^(-x)   Δ     x     e^(-x)   Δ

000 1 000 to 0 40 0 670 7 080 () 4~9 ~ 300 0050 5


0 Ot 0990 10 0 41 0 6G4 7 0 81 0445 5 3 to 0 Oi5 4
02 930 10 42 657 7 82 440 4 320 41 4
6SO 436 37
03
0-t
970
981
9
10
43
4.~ 6!-i
6
6
83
84 412 "
5
330
340 33
30
4
3
05 951 9 45 63g 7 85 427 4 350 3
Ofi. 9!2 to 46 631 6 86 423 4 360 27 2
07 932 9 47 625 6 87 419 4 3 70 25 3
09 923 9 49 Gt9 6 gg 415 4 380 22 2
09 914. 0 49 613 7 80 411 4 390 20 2
0 t{J 0 005 9 0 50 0 606 6 090 0 407 4 4 00 Q 0183 f7
tt 896 9 51 6QC) 5 91 403 ~ 4 to 166 16
t2 837 9 52 595 G 92 399 4 4 20 150 14
t) 878 9 53 599 6 93 395 4 410 t36 13
~~ 869 8 5J 533 & 94 391 4 4 40 123 J2
15 861 {} 55 577 6 95 3~7 4 4SO 111 to
t& 8;:.~2 8 56 571 6 96 333 4 460 101 10
17 84\ 9 57 565 5 97 379 4 4 70 0 0091 9
18 835 8 53 560 6 93 375 3 4 80 82 8
19 827 8 59 55i 5 99 372 4 4 90 74 7
020 0 819 8 0 60 0 5t9 6 100 0 363 35 5 00 0 0067 6
21 811 8 61 5.\3 5.. 1 tO 333 31 s to 61 6
22 803 8 62 533 5 t 20 302 29 5 20 55 5
23 ..,g.J M 63 533 6 1 3() 273 26 5 30 so 5
Zi 787 B 6~ 527 5 1 40 2i7 2~ 5 40 45 4
25 779 8 65 522 5 t 50 223 21 5 50 41 4
26 771 8 66 517 5 iSO 202 i9 560 37 4
27 763 7 67 512 5 1 70 iS3 18 5 70 33 3
2g 756 8 63 507 5 t so 165 15 580 30 3
29 7..3 7 69 50:! 5 190 150 15 5 90 27 2
0 30 0 7'jt 8 0 70 0 497 5 200 0 135 13 600 0 0025 3
31 733 7 71 492 5 2 10 122 tt 6 10 22 2
32 726 1 72 437 5 2 20 111 11 6 20 20 2
33 7J9 7 73 4S2 5 230 100 9 630 JB J
3~ 712 7 74 477 5 2 40 0 091 9 6 40 17 2
35 705 7 75 472 4 2 5J 82 8 6 50 15 1
38 693 7 76 4:6S 5 200 14 7 £ 60 t~ 2
37 691 7 77 463 5 2 70 67 G 6 70 12 1
33 68~ 7 78 459 4 280 61 6 680 it t
39 677 7
.....
79 45~ 5 2 9[) 55 5 6 90 10 1
0 4:0 0 67()
~ 080 0 449 3 OJ 0 QjQ 7 00 0 0009
APPENDIX 4

The Density of the Normal Distribution f(x) = (1/√2π)·e^(-x²/2)

x     0     1     2     3     4     5     6     7     8     9

' 0.0 0.3939 31)89 3939 3938 3936 39% 3932 3930 3977 3973
I
' 0 .1 3970 3965 3961 3956 3951 3945 3939 3932 3925 3918
0.2 3910 3902 3894 3885 3876 3867 3357 3347 3836 3825
' 0.3
I
3814 3802 3790 3778 3765 3752 3739 3726 3712 3697
' 0.4
I
3683 3668 3653 3637 3621 3605 3589 3572 3555 3533
0.5 3521 3503 3485 3467 3448 3429 3410 3391 3372 3352
0.6 3332 3312 3292 3271 3251 323) 3209 3187 3166 3144
' 0.7 3123 3101 3079 3056 3034 3011 2939 2966 2943 2920
0.8 2397 2374 2350 2327 2803 2780 2756 2732 2709 2685
0.9 2661 2637 2613 2539 2565 2541 2516 2492 2468 2444
1.0 0.2420 2396 2371 2347 2323 2299 2275 2251 2227 2203
1.1 2179 2155 2131 2107 2v33 2059 2036 2012 1939 1965
I 1.2 1942 1919 1895 1872 1349 1826 1804 1781 1753 1736
1.3 1714 1691 1669 1647 1626 1604 1582 1561 1539 1518
' 1.4 1497 1476 1456 1435 1415 1394 1374 1354 1334 1315
1.5 1295 '1276 1257 1238 1219 120J 1182 1163 1145 1127
1.6 1109 1092 1074 1057 1040 1023 1006 0339 0973 0957
1.7 0940 0925 0909 0393 087o 0363 0848 0333 0318 0304
1.8 0790 0775 0761 0748 0734 0721 0707 0694 0381 0669
1.9 0656 0644 0632 0620 0603 0596 0584 0573 0562 0551
2.0 0.0540 0529 0519 0508 0498 0438 0478 0463 0459 0449
2.1 0440 0431 0422 0113 0404 0396 0337 0379 0371 0363
2.2 0355 0347 0339 0332 0325 0317 0310 0303 0297 0290
2.3 0283 0277 0270 0264 0253 0252 0246 0241 0235 0229
2.4 0224 0219 0213 02()8 0203 0193 0194 0189 0184 0180
2.5 0175 0171 0167 0163 0158 0154 0151 0147 0143 0139
2.6 0136 0132 0129 0126 0122 0119 0116 0113 0110 0107
2.7 0104 0101 OJ99 0)96 OJ93 0091 0()38 OJ36 0084 0081
2.8 0079 0077 0075 0073 OJ71 0069 0()37 0065 0063 0061
2.9 0060 0058 0056 OJ55 OJ 53 OJ 51 OJ50 0043 oon 0046
3.0 0.0044 0043 0042 0040 0039 0038 OJ37 0036 0035 0034
3.1 0033 003?. OJ31 0030 0029 0028 OJ27 OJ26 0025 0025
3.2 0024 0023 0022 0:)22 OJ21 0020 OJ20 0019 OJ18 0018
3.3 0017 0017 0016 0016 0015 0()15 OJ14 0()14 0013 0013
3.4 OJ12 0012 0012 0011 OJ11 OJ10 0010 OJ10 OOJ9 0009
0008 0008 0003 OJ08 00()7 OOJ7 OJJ7 00()7 0003
3.5 0009
3.6 0006 0006 0006 OJ05 0005 0))5 OJJ5 0005 0005 0004
3.7 0004 0)04 OOJ4 0004 OJJ4 00()3 OJJ3 OOJ3 0003
OOJ!
OJJ3 0003 0003 0002 0()[)2 OJJ2 0002 0002
3.8 0003 0003
0002 0002 0002 0002 0002 0002 0001 0001
3.9 0002 0002
APPENDIX 5

The Values of the Error Function Φ(x) = (1/√2π) ∫_0^x e^(-z²/2) dz

x     0     1     2     3     4     5     6     7     8     9
00 0 0000 00399 00798 01.197 01595 01994. 02392 02790 03188 03586
0 t 03983 04380 04.776 05172 05567 05!)62 06356 ()6749 07142 07535
0 2 0792G 08317 08706 00095 09483 00871 10257 10042 11026 it409 ~
0 3 11791 12172 12552 i2930 i8307 13683 14058 14431 t4803 t5173
04 15542 t5910 10276 16640 17003 1.7364 17724 iE082 18439 18793
0 5 19146 10497 19847 20194 20540 20884 21226 21566 219M. 22240
0 6 2~575 22907 23237 23565 23891 24215 24S37 24857 251;5 25490
0 i 25804 20.115 26424 26:730 27035 27337 27637 27935 2.8230 28524
0 8 28814 29i03 29389 29673 29955 20234 30511 30785 31057 31327
0 9 31594 318S9 32121 32381 32639 32894 33147 33398 33646 33891
i 0 34134 34375 3461, 34850 35083 353t4 3S543 35769 35993 36214
i i. 36433 356SO 368M 37076 37286 37493 37698 37000 38100 38298
1 2 38493 38686 38877 39005 39251 39435 30617 39796 SS973 40147
13 40320 40490 40658 40824 40988 41149 4i309 41466 41621 41774
t 4 41924 42073 42220 42364 42507 42S47 42786 -42922 43056 43189
15 43319 43448 43574 43699 43822 43943 44062 4.4179 44295 44409
16 44520 44630 44738 44M5 44950 45053 ~5154 4.5254 45352 45449
f 7 ~5543 45837 45;28 45818 45907 459!)4 4.£0EO 46164 46246 4S327
18 46407 46485 40.562 48638 46712 46784 46856 46926 46995 47062
1 9 47128 47193 47257 47320 47381 47441 47500 47558 4.76t5 4.7670
20 -47725 4t778 47831 47882 47932 47982 4~030 48077 48i24 48169
2 i 48214 48257 4.8300 48341 48382 48422 484.61 48500 48537 48574
2 2 48610 48645 '8G79 487t3 48745 48778 4SE09 4B840 48870 48899
2 3 48928 48956 48983 49010 49036 49061 490S6 49111 49i34 4g158
24 49180 49202 49224 4924S 49266 492Eo 49305 49324 49;343 4.9361
2 5 4.9379 49396 49413 49430 49446 49461 49477 49492 49506 49520
2 6 49534 4.9547 49560 49573 49585 49598 4.9609 4.9621 49632 49843
2 7 49653 49664 49674 49683 49693 49702 49711 49720 49728 49736
28 49744 49752 4.97SO 49767 49774 49781 49768 49795 4S801 49807
2 9 49813 49819 49825 49831 49836 49841 498-46 4985t 49856 49861
APPENDIX 6

Some of the Properties of Generalized Functions

When solvin g proble ms dealin g with rando m functi ons, it is often conve nient
-to use transf ormat ions involv ing variou s step functi ons as well as genera lized func-
-tions like the delta functi on.
Here are the defini tions and main proper ties of these functi ons of a real argu-
ment T.
1. |τ|, a modulus (absolute value):
|τ| = τ for τ ≥ 0,   -τ for τ < 0.
2. 1(τ), a unit function (a unit jump):
1(τ) = 1 for τ > 0,   1/2 for τ = 0,   0 for τ < 0.
3. sign τ, the sign of the value of τ (the signum):
sign τ = 1 for τ > 0,   0 for τ = 0,   -1 for τ < 0.
4. δ(τ), the delta function:
δ(τ) = (d/dτ)·1(τ).
The delta function is an even function of τ. The main properties of the delta function are
(a) τδ(τ) ≡ 0 and, in general, φ(τ)δ(τ) = 0 if φ(τ) is an odd function continuous at τ = 0;
(b) ∫_{0-ε}^{0+ε} ψ(τ)δ(τ) dτ = ψ(0) if the function ψ(τ) is continuous at the point τ = 0 (ε > 0);
(c) ∫_{0-ε}^{0} ψ(τ)δ(τ) dτ = ∫_{0}^{0+ε} ψ(τ)δ(τ) dτ = ½ψ(0) if the function ψ(τ) is continuous at the point τ = 0.


These definitions yield the following properties, which hold for any real τ and any odd function φ(τ):
(1) |τ| = τ sign τ,   (2) τ = |τ| sign τ,
(3) φ(τ) = φ(|τ|) sign τ,   (4) φ(|τ|) = φ(τ) sign τ,
(5) φ²(|τ|) = φ²(τ),   (6) sign τ = 2·1(τ) - 1,
(7) 1(τ) = ½[sign τ + 1],
(8) |τ| = τ[2·1(τ) - 1],
(9) d|τ|/dτ = sign τ,
(10) d²|τ|/dτ² = d(sign τ)/dτ = 2δ(τ),
(11) 1(τ) = ∫_{-∞}^{τ} δ(t) dt = ½ ∫_{-∞}^{τ} d(sign t).
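Several of these identities can be verified symbolically; the short sketch below (SymPy assumed) checks properties (9), (10) and (b).

```python
import sympy as sp

t = sp.symbols('t', real=True)

print(sp.diff(sp.Abs(t), t))             # property (9): derivative of |t|
print(sp.diff(sp.sign(t), t))            # property (10): derivative of sign t
print(sp.integrate(sp.cos(t) * sp.DiracDelta(t), (t, -1, 1)))   # property (b): psi(0) with psi = cos
```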
APPENDIX 7

A Table of the Relationships Between Various Correlation Functions R_x(τ) and Spectral Densities S_x(ω)

R_x(τ)                                           S_x(ω)
1. Var 6 (-o), where 6 (t) is a delta Var/(2:n)
function
2. Var Var 6 (oo)
3. Var cos ~-- (Var/2) [6 (oo+~J+o (oo-~)]
n n
4. ~ Vari cos ~iT ; ~ Vari[6 (co+~i)+o (co-~i)l
i=1 i=1
5. Var e -a 1-r 1 (Var cxj:n)(a 2+ oo 2)-1
n n
6. ~ Vari e-ai I -r 1 1 "' Varicxi
:It LJ
. 1
a2
t
+ (1)2
i=1 t=
Vara a2+~2+oo2
7. Var e-a I -r I cos ~·
:n [a2+(~-oo)2] [a2+(~+oo)2]
Vara 2(a2+~2)
8. Var e-a I -r 1(cos ~·+f sin~ 1•1) :n (co2+a2-~2)2+4a2~2
Var ex 2oo 2
9 . Var e-a I -r I (cos ~---i sin~ 1-ol) :n {oo2+a2+~2)2-4~2 00 2
Vara 2 (a2-~ 2 )
10. Var e-a I -r I (cosh~·
:n [(a-~)2+co2J[(a+~J2+co2)

+~ sinh~ 1•1) (a?<~)


Var ( sin (oo/2) ) 2
11. Var (1-ITI) 1(1-I-o D, where
1 (x) is a unit function 2:n 00/2
12. Var e-a 1-r 1 (1+a IT D (Var aj:n) 2a3 j(co 2+a 2 ) 2
Vara a4
13. Vare-a l-r1(1 +aiTI+ i-a 2 T2 ) :n 3 (co2+ cx2)3
Var a 16a.3 co 4
14. Vare- aiT1(1 +aiT/- 2a 2-r 2 :n (w2+a2) 4

+}as IT 13 )
15. 2a sin ~T/T a1 (1-1 oo I/~)
0 for 0 ~ I co I ~ ~.
sin ~T
16. 2a 2 (2 cos ~T-1) ,;.:__..~ • a 2 for ~ <I oo I ~ 2~.
0 for 2~ <I oo I
17. Var e -(a't) 2 Var
2cx y-;t
exp[-(
co
2a
rJ
18. Var e-a I -r I [26 (T)-a (sign T) 2 ) (Var a/:n)[co2 f(a2+ co2))
