Random Variables and Process
Prof. B B Tiwari
Introduction
Signals may be deterministic or random. A signal is random when its values involve uncertainty: it is neither exactly predictable nor completely unpredictable, and the probability of a prediction being correct can be quantified to some extent.
Probability
When the possible outcomes of an experiment are not always the same, we treat it with probability theory. For example, when an experiment is repeated N times and the outcome A occurs NA times, the relative frequency of occurrence of A is NA/N. The probability of A is the limit of this relative frequency:

P(A) = lim_{N→∞} NA/N
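The relative-frequency definition can be illustrated with a quick simulation (a sketch; the event A is arbitrarily chosen here as "a fair die shows a six", so P(A) = 1/6):

```python
import random

random.seed(1)

def relative_frequency(n_trials):
    """Estimate P(A) as NA / N for the event A = 'a fair die shows 6'."""
    n_a = sum(1 for _ in range(n_trials) if random.randint(1, 6) == 6)
    return n_a / n_trials

# The relative frequency approaches P(A) = 1/6 as N grows.
for n in (100, 10_000, 1_000_000):
    print(n, relative_frequency(n))
```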
If A1 and A2 are mutually exclusive events (they cannot occur together), with probabilities P(A1) and P(A2), then P(A1 or A2) = P(A1) + P(A2). In general, for L mutually exclusive events,

P(A1 or A2 or ... or AL) = Σ_{i=1}^{L} P(Ai)
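A minimal numeric check of the additivity rule, taking the six equally likely faces of a fair die as the mutually exclusive events (an assumed illustrative example):

```python
# The six faces of a fair die are mutually exclusive events A1..A6,
# each with probability 1/6.
p = {face: 1 / 6 for face in range(1, 7)}

# P(A1 or A2 or A3) = P(A1) + P(A2) + P(A3) = 1/2
p_1_or_2_or_3 = sum(p[face] for face in (1, 2, 3))
print(p_1_or_2_or_3)
```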
If there are two events A and B, the occurrence of one may affect the other. In this case the conditional probability of B given A is

P(B|A) = P(A|B) P(B) / P(A)

This result is known as Bayes' theorem. If A and B are independent, then P(B|A) = P(B), and the joint probability is P(A,B) = P(A) P(B).
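Bayes' theorem can be checked numerically. The numbers below are assumed illustrative values (not from the text), modeled on a test for a rare condition:

```python
# Assumed illustrative probabilities (not from the text):
p_b = 0.01              # P(B): prior probability of the condition
p_a_given_b = 0.9       # P(A|B): test positive given condition
p_a_given_not_b = 0.05  # P(A|not B): false positive rate

# Total probability: P(A) = P(A|B) P(B) + P(A|not B) P(not B)
p_a = p_a_given_b * p_b + p_a_given_not_b * (1 - p_b)

# Bayes' theorem: P(B|A) = P(A|B) P(B) / P(A)
p_b_given_a = p_a_given_b * p_b / p_a
print(round(p_b_given_a, 4))
```

Note that even with a 90% detection rate, the posterior P(B|A) is small because the prior P(B) is small; this is exactly the weighting that the a priori probabilities perform in the receiver example later in these notes.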
F(x)
The probability that the outcome X is less than or equal to x1 is

P(X ≤ x1) = F(x1) = ∫_{-∞}^{x1} f(x) dx

Similarly,

P(X ≤ x2) = F(x2) = ∫_{-∞}^{x2} f(x) dx

The probability that the outcome lies in the range x1 ≤ X ≤ x2 is

P(x1 ≤ X ≤ x2) = P(X ≤ x2) − P(X < x1) = F(x2) − F(x1) = ∫_{x1}^{x2} f(x) dx
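The range probability F(x2) − F(x1) can be computed directly once F is known. As a sketch, assume an exponential density (chosen only for illustration, since its CDF has a closed form):

```python
import math

def F(x, lam=1.0):
    """CDF of an exponential random variable: F(x) = 1 - e^(-lam*x) for x > 0."""
    return 1.0 - math.exp(-lam * x) if x > 0 else 0.0

x1, x2 = 0.5, 1.5
# P(x1 <= X <= x2) = F(x2) - F(x1)
p_range = F(x2) - F(x1)
print(p_range)
```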
For two random variables X and Y, the probability that x ≤ X ≤ x+dx while at the same time y ≤ Y ≤ y+dy is

P(x ≤ X ≤ x+dx, y ≤ Y ≤ y+dy) = fXY(x,y) dx dy

This is the volume enclosed by fXY(x,y) above the rectangle dx dy. More generally,

P(x1 ≤ X ≤ x2, y1 ≤ Y ≤ y2) = ∫_{x1}^{x2} ∫_{y1}^{y2} fXY(x,y) dy dx
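The "volume under fXY" interpretation can be sketched with a Riemann sum. The joint density of two independent uniform(0,1) variables is assumed here for illustration, since the exact answer is simply the rectangle's area:

```python
# Joint density of two independent uniform(0,1) variables: f_XY(x, y) = 1
# on the unit square (an assumed example density).

def f_xy(x, y):
    return 1.0 if 0.0 <= x <= 1.0 and 0.0 <= y <= 1.0 else 0.0

def rect_probability(x1, x2, y1, y2, n=400):
    """Approximate the double integral of f_XY over the rectangle by a midpoint Riemann sum."""
    dx = (x2 - x1) / n
    dy = (y2 - y1) / n
    total = 0.0
    for i in range(n):
        for j in range(n):
            total += f_xy(x1 + (i + 0.5) * dx, y1 + (j + 0.5) * dy) * dx * dy
    return total

# For this density the exact answer is (x2 - x1) * (y2 - y1) = 0.12
print(rect_probability(0.2, 0.6, 0.1, 0.4))
```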
A communication example
We want to transmit one of two possible messages: the message m0, meaning that the bit 0 is intended, or the message m1, meaning that the bit 1 is intended. When m0 is received, the receiver generates some voltage, say r0, which may be as simple as a dc voltage, while m1 when received generates a voltage r1.
P(r0|m0) = probability that r0 is received given that m0 is sent,
P(r1|m0) = probability that r1 is received given that m0 is sent,
P(r0|m1) = probability that r0 is received given that m1 is sent,
P(r1|m1) = probability that r1 is received given that m1 is sent.
We allow for the general case in which the messages m1 and m0 do not occur with equal frequency, and introduce the probabilities P(m1) and P(m0), called the a priori probabilities. Now
P(m0|r0) = probability that m0 is the message given that r0 is received,
P(m1|r0) = probability that m1 is the message given that r0 is received.
Clearly, if P(m0|r0) > P(m1|r0) we should decide that m0 was intended, and if the inequality is reversed we should decide for m1. Altogether, our algorithm should be:
If r0 is received: choose m0 if P(m0|r0) > P(m1|r0); choose m1 if P(m1|r0) > P(m0|r0).
If r1 is received: choose m0 if P(m0|r1) > P(m1|r1); choose m1 if P(m1|r1) > P(m0|r1).
A receiver which operates in accordance with this algorithm is said to maximize the a posteriori probability (MAP) of a correct decision and is called an optimum receiver. Applying Bayes' theorem, the decision rules become:
Decide m0 when r0 is received if P(r0|m0)P(m0) > P(r0|m1)P(m1)
Decide m1 when r1 is received if P(r1|m1)P(m1) > P(r1|m0)P(m0)
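The MAP decision rule above can be sketched directly. The channel transition probabilities and the a priori probabilities below are assumed illustrative numbers, not values from the text:

```python
# Assumed illustrative numbers for the binary example:
prior = {"m0": 0.6, "m1": 0.4}  # a priori probabilities P(m0), P(m1)
likelihood = {                   # P(r | m): channel transition probabilities
    ("r0", "m0"): 0.9, ("r1", "m0"): 0.1,
    ("r0", "m1"): 0.2, ("r1", "m1"): 0.8,
}

def map_decide(r):
    """Choose the message m that maximizes P(r|m) * P(m), i.e. the MAP decision."""
    return max(("m0", "m1"), key=lambda m: likelihood[(r, m)] * prior[m])

# r0 received: compares P(r0|m0)P(m0) = 0.54 against P(r0|m1)P(m1) = 0.08
print(map_decide("r0"))
print(map_decide("r1"))
```

Because the priors enter the comparison, a sufficiently lopsided P(m0) can make the receiver decide m0 even for a received voltage that is more likely under m1.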
Tchebycheff's Inequality

Tchebycheff's inequality states that for a random variable X with mean m and variance σ²,

P(|X − m| ≥ ε) ≤ σ²/ε²

Proof: To prove this inequality we start with the expression for the variance shown on the last slide. Assuming m = X̄ = 0 we have,
σ² = ∫_{-∞}^{∞} x² f(x) dx
   = ∫_{-∞}^{-ε} x² f(x) dx + ∫_{-ε}^{ε} x² f(x) dx + ∫_{ε}^{∞} x² f(x) dx

Since x² ≥ 0 and f(x) ≥ 0 for all x, every term is non-negative, so dropping the middle term can only decrease the right-hand side:

σ² ≥ ∫_{-∞}^{-ε} x² f(x) dx + ∫_{ε}^{∞} x² f(x) dx

In the ranges −∞ ≤ x ≤ −ε and ε ≤ x ≤ ∞ we have x² ≥ ε², so replacing x² by ε²:

σ² ≥ ε² [ ∫_{-∞}^{-ε} f(x) dx + ∫_{ε}^{∞} f(x) dx ]

But P(X ≤ −ε) = ∫_{-∞}^{-ε} f(x) dx and P(X ≥ ε) = ∫_{ε}^{∞} f(x) dx, so

σ² ≥ ε² P(|X| ≥ ε)

and therefore

P(|X| ≥ ε) ≤ σ²/ε²
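Tchebycheff's inequality can be checked by Monte Carlo simulation. A zero-mean uniform variable on [−1, 1] is assumed here for illustration; its variance is σ² = 1/3:

```python
import random

random.seed(0)

# Monte Carlo check of Tchebycheff's inequality for a zero-mean uniform
# variable on [-1, 1], which has variance sigma^2 = 1/3.
sigma2 = 1.0 / 3.0
eps = 0.9
n = 100_000
samples = [random.uniform(-1.0, 1.0) for _ in range(n)]

# Empirical tail probability P(|X| >= eps); the exact value is 0.1 here,
# comfortably below the Tchebycheff bound sigma^2 / eps^2 ≈ 0.41.
p_tail = sum(1 for x in samples if abs(x) >= eps) / n
print(p_tail, "<=", sigma2 / eps**2)
```

The bound is loose, as is typical: Tchebycheff's inequality holds for every distribution with finite variance, so it cannot be tight for any particular one.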
The Gaussian Random Variable

For a Gaussian random variable with density

f(x) = (1/(σ√(2π))) e^{−(x−m)²/2σ²}

the mean is

X̄ = ∫_{-∞}^{∞} x f(x) dx = m

and the variance is

E[(X−m)²] = ∫_{-∞}^{∞} (x−m)² f(x) dx = σ²
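The Gaussian mean and variance integrals can be checked numerically. The values m = 2 and σ = 1.5 are assumed example parameters:

```python
import math

# Assumed example parameters for the Gaussian density:
m, sigma = 2.0, 1.5

def gaussian_pdf(x):
    """Gaussian density f(x) = (1 / (sigma * sqrt(2*pi))) * exp(-(x - m)^2 / (2 * sigma^2))."""
    return math.exp(-(x - m) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def integrate(g, a=-20.0, b=24.0, n=20_000):
    """Midpoint-rule integral of g over [a, b]; the tails beyond are negligible here."""
    dx = (b - a) / n
    return sum(g(a + (i + 0.5) * dx) * dx for i in range(n))

mean = integrate(lambda x: x * gaussian_pdf(x))          # should be close to m
var = integrate(lambda x: (x - mean) ** 2 * gaussian_pdf(x))  # should be close to sigma^2
print(mean, var)
```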
Error Function