
UCSD ECE250 Handout #6

Prof. Young-Han Kim Wednesday, January 21, 2015

Solutions to Exercise Set #2


(Prepared by TA Fatemeh Arbabjolfaei)

1. Juror’s fallacy. Suppose that P (A|B) ≥ P (A) and P (A|C) ≥ P (A). Is it always true that
P (A|B, C) ≥ P (A) ? Prove or provide a counterexample.

Solution: The answer is no. There are many counterexamples that can be given. For
example, suppose a fair die is thrown and let X denote the number of dots. Let A be the
event that X = 3 or 6; let B be the event that X = 3 or 5; and let C be the event that X = 5
or 6. Then, we have
P (A) = 1/3, P (A|B) = P (A|C) = 1/2, but P (A|B, C) = 0.
Apparently, having two pieces of positive evidence does not necessarily lead to stronger evidence.
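The counterexample can be checked by direct enumeration over the six equally likely die outcomes (a quick sketch, not part of the original handout):

```python
from fractions import Fraction

# Fair die: outcomes 1..6, each with probability 1/6
outcomes = set(range(1, 7))
A = {3, 6}
B = {3, 5}
C = {5, 6}

def prob(event):
    return Fraction(len(event & outcomes), 6)

def cond(event, given):
    # P(event | given) by counting, valid since outcomes are equally likely
    return Fraction(len(event & given), len(given))

assert prob(A) == Fraction(1, 3)
assert cond(A, B) == Fraction(1, 2)
assert cond(A, C) == Fraction(1, 2)
assert cond(A, B & C) == 0   # B ∩ C = {5}, and 5 is not in A
```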

2. Polya’s urn. Suppose we have an urn containing one red ball and one blue ball. We draw a
ball at random from the urn. If it is red, we put the drawn ball plus another red ball into the
urn. If it is blue, we put the drawn ball plus another blue ball into the urn. We then repeat
this process. At the n-th stage, we draw a ball at random from the urn with n + 1 balls, note
its color, and put the drawn ball plus another ball of the same color into the urn.

(a) Find the probability that the first ball is red.


(b) Find the probability that the second ball is red.
(c) Find the probability that the first three balls are all red.
(d) Find the probability that two of the first three balls are red.

Solution: Let Xi denote the color of the i-th ball.


(a) By symmetry, P{X1 = R} = 1/2.
(b) Again by symmetry, P{Xi = R} = 1/2 for all i. Alternatively, by the law of total
probability, we have
P{X2 = R} = P{X1 = R}P{X2 = R | X1 = R} + P{X1 = B}P{X2 = R | X1 = B}
          = (1/2)(2/3) + (1/2)(1/3) = 1/2.
(c) By the chain rule, we have
P{X1 = R, X2 = R, X3 = R}
  = P{X1 = R}P{X2 = R | X1 = R}P{X3 = R | X2 = R, X1 = R}
  = (1/2)(2/3)(3/4) = 1/4.

(d) Let N denote the number of red balls in the first three draws. From part (c), we know
that P{N = 3} = 1/4 = P{N = 0}, where the latter identity follows by symmetry. By the
same red–blue symmetry, P{N = 2} = P{N = 1}. Since the four probabilities sum to 1,
P{N = 2} must be 1/4.
Alternatively, we have

P{N = 2} = P{X1 = B, X2 = R, X3 = R} + P{X1 = R, X2 = B, X3 = R}
             + P{X1 = R, X2 = R, X3 = B}
         = P{X1 = B}P{X2 = R | X1 = B}P{X3 = R | X2 = R, X1 = B}
             + P{X1 = R}P{X2 = B | X1 = R}P{X3 = R | X2 = B, X1 = R}
             + P{X1 = R}P{X2 = R | X1 = R}P{X3 = B | X2 = R, X1 = R}
         = (1/2)(1/3)(2/4) + (1/2)(1/3)(2/4) + (1/2)(2/3)(1/4) = 1/4.
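The chain-rule computations above can be verified by exact enumeration of all draw sequences, tracking the urn contents (a sketch, not part of the original handout):

```python
from fractions import Fraction
from itertools import product

def seq_prob(colors):
    """Exact probability of a draw sequence from Polya's urn (1 red, 1 blue)."""
    red, blue = 1, 1
    p = Fraction(1)
    for c in colors:
        total = red + blue
        if c == 'R':
            p *= Fraction(red, total)
            red += 1           # drawn red ball returned plus one more red
        else:
            p *= Fraction(blue, total)
            blue += 1          # drawn blue ball returned plus one more blue
    return p

# (b) P{X2 = R}: sum over both possible first draws
p_x2_red = sum(seq_prob(s) for s in product('RB', repeat=2) if s[1] == 'R')
assert p_x2_red == Fraction(1, 2)

# (c) P{first three all red}
assert seq_prob('RRR') == Fraction(1, 4)

# (d) P{exactly two of the first three are red}
p_two = sum(seq_prob(s) for s in product('RB', repeat=3) if s.count('R') == 2)
assert p_two == Fraction(1, 4)
```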

3. Probabilities from a cdf. Let X be a random variable with the cdf shown below.
[Figure: cdf F(x), rising as x²/3 on [0, 1), flat at 1/3 on [1, 2), jumping to 2/3 at x = 2,
and increasing linearly to 1 at x = 4.]

Find the probabilities of the following events.

(a) {X = 2}.
(b) {X < 2}.
(c) {X = 2} ∪ {0.5 ≤ X ≤ 1.5}.
(d) {X = 2} ∪ {0.5 ≤ X ≤ 3}.

Solution:

(a) There is a jump at X = 2, so we have

P{X = 2} = P{X ≤ 2} − P{X < 2}
         = F(2) − F(2⁻)
         = 2/3 − 1/3
         = 1/3.

(b) P{X < 2} = F(2⁻) = 1/3.

(c) Since {X = 2} and {0.5 ≤ X ≤ 1.5} are disjoint events,

    P({X = 2} ∪ {0.5 ≤ X ≤ 1.5}) = P{X = 2} + P{0.5 ≤ X ≤ 1.5}
                                 = 1/3 + F(1.5) − F(0.5⁻)
                                 = 1/3 + 1/3 − (1/3)(0.5)²
                                 = 7/12.

(d) We have

    P({X = 2} ∪ {0.5 ≤ X ≤ 3}) = P{0.5 ≤ X ≤ 3}
                               = F(3) − F(0.5⁻)
                               = 5/6 − (1/3)(0.5)²
                               = 3/4.
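The four answers can be checked numerically. The piecewise cdf below is inferred from the values used in the solution (F(0.5⁻) = 0.5²/3, F(1.5) = F(2⁻) = 1/3, F(2) = 2/3, F(3) = 5/6); the exact shape between those points is an assumption:

```python
def F(x):
    """cdf inferred from the solution's values: x**2/3 on [0, 1), flat at
    1/3 on [1, 2), a jump to 2/3 at x = 2, then linear up to 1 at x = 4.
    The shape between the evaluated points is an assumption."""
    if x < 0:
        return 0.0
    if x < 1:
        return x**2 / 3
    if x < 2:
        return 1 / 3
    if x < 4:
        return 2 / 3 + (x - 2) / 6
    return 1.0

def F_left(x, eps=1e-9):
    return F(x - eps)   # F(x^-), the left limit

assert abs(F(2) - F_left(2) - 1/3) < 1e-6     # (a) P{X = 2} = 1/3
assert abs(F_left(2) - 1/3) < 1e-6            # (b) P{X < 2} = 1/3
p_c = F(2) - F_left(2) + F(1.5) - F_left(0.5)
assert abs(p_c - 7/12) < 1e-6                 # (c) = 7/12
p_d = F(3) - F_left(0.5)
assert abs(p_d - 3/4) < 1e-6                  # (d) = 3/4
```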

4. Gaussian probabilities. Let X ∼ N (1000, 400). Express the following in terms of the Q
function.

(a) P{0 < X < 1020}.


(b) P{X < 1020|X > 960}.

Solution: Since (X − µ)/σ ∼ N(0, 1), the cdf of X is F(x) = Φ((x − µ)/σ) = 1 − Q((x − µ)/σ).
Here µ = 1000 and σ = √400 = 20.

(a) We have

    P{0 < X < 1020} = Q((0 − 1000)/20) − Q((1020 − 1000)/20) = Q(−50) − Q(1).

(b) We have

    P{X < 1020 | X > 960} = P{960 < X < 1020} / P{X > 960}
                          = [Q((960 − 1000)/20) − Q((1020 − 1000)/20)] / Q((960 − 1000)/20)
                          = [Q(−2) − Q(1)] / Q(−2).
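The Q function can be evaluated with the standard library's complementary error function via Q(x) = (1/2)erfc(x/√2) (a numerical sketch, not part of the original handout):

```python
from math import erfc, sqrt

def Q(x):
    """Gaussian tail probability Q(x) = P{Z > x} for Z ~ N(0, 1)."""
    return 0.5 * erfc(x / sqrt(2))

mu, sigma = 1000, 20   # X ~ N(1000, 400): variance 400, standard deviation 20

# (a) P{0 < X < 1020}
p_a = Q((0 - mu) / sigma) - Q((1020 - mu) / sigma)
assert abs(p_a - (Q(-50) - Q(1))) < 1e-15

# (b) P{X < 1020 | X > 960}
p_b = (Q(-2) - Q(1)) / Q(-2)

assert abs(Q(0) - 0.5) < 1e-15   # sanity check on Q
assert abs(Q(-50) - 1) < 1e-15   # the tail below 0 is negligible
assert 0 < p_b < 1
```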

5. Laplacian. Let X ∼ f(x) = (1/2)e^{−|x|}.

(a) Sketch the cdf of X.


(b) Find P{|X| ≤ 2 or X ≥ 0} .

(c) Find P{|X| + |X − 3| ≤ 3} .
(d) Find P{X ≥ 0 | X ≤ 1} .

Solution:
(a) We have
FX(x) = ∫_{−∞}^{x} (1/2)e^{−|u|} du = (1/2)e^{x} if x < 0, and 1 − (1/2)e^{−x} if x ≥ 0.

[Figure 1: cdf of X.]

(b) We have
P{|X| ≤ 2 or X ≥ 0} = P{X ≥ −2}
                    = 1 − P{X < −2}
                    = 1 − ∫_{−∞}^{−2} (1/2)e^{−|x|} dx
                    = 1 − (1/2)e^{−2}.
(c) We have
P{|X| + |X − 3| ≤ 3} = P{0 ≤ X ≤ 3}   (since |x| + |x − 3| = 3 exactly when 0 ≤ x ≤ 3)
                     = ∫_{0}^{3} (1/2)e^{−|x|} dx
                     = 1/2 − (1/2)e^{−3}.
(d) We have

    P{X ≥ 0 | X ≤ 1} = P{0 ≤ X ≤ 1} / P{X ≤ 1}
                     = (FX(1) − FX(0⁻)) / FX(1)
                     = (1/2 − (1/2)e^{−1}) / (1 − (1/2)e^{−1}).
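The Laplacian cdf from part (a) directly verifies the three probabilities (a sketch, not part of the original handout):

```python
from math import exp

def F(x):
    """cdf of the Laplacian f(x) = (1/2)exp(-|x|), from part (a)."""
    return 0.5 * exp(x) if x < 0 else 1 - 0.5 * exp(-x)

# (b) P{|X| <= 2 or X >= 0} = P{X >= -2} = 1 - F(-2)
assert abs((1 - F(-2)) - (1 - 0.5 * exp(-2))) < 1e-12

# (c) P{|X| + |X - 3| <= 3} = P{0 <= X <= 3} = F(3) - F(0)
assert abs((F(3) - F(0)) - (0.5 - 0.5 * exp(-3))) < 1e-12

# (d) P{X >= 0 | X <= 1} = (F(1) - F(0)) / F(1), since F is continuous at 0
p_d = (F(1) - F(0)) / F(1)
assert abs(p_d - (0.5 - 0.5 * exp(-1)) / (1 - 0.5 * exp(-1))) < 1e-12
```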

6. Distance to the nearest star. Let the random variable N be the number of stars in a region
of space of volume V . Assume that N is a Poisson r.v. with pmf
pN(n) = e^{−ρV}(ρV)^n / n!,   for n = 0, 1, 2, . . . ,
where ρ is the "density" of stars in space. We choose an arbitrary point in space and define
the random variable X to be the distance from the chosen point to the nearest star. Find the
pdf of X (in terms of ρ).

Solution: The trick in this problem, as in many others, is to find a way to connect events
regarding X with events regarding N . In our case, for x ≥ 0:

FX(x) = P{X ≤ x}
      = 1 − P{X > x}
      = 1 − P{no stars within distance x}
      = 1 − P{N = 0 in the sphere of radius x centered at the chosen point}
      = 1 − e^{−ρ(4/3)πx³}.

Now differentiating, we get

fX(x) = 4πρx² e^{−ρ(4/3)πx³}.

For x < 0, both the cdf and the pdf are zero.
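A numerical sanity check: fX should match the derivative of FX and integrate to 1. The density ρ = 0.1 below is an arbitrary choice for the check (a sketch, not part of the original handout):

```python
from math import exp, pi

rho = 0.1   # arbitrary star density, for checking only

def F(x):
    return 1 - exp(-rho * (4/3) * pi * x**3) if x >= 0 else 0.0

def f(x):
    return 4 * pi * rho * x**2 * exp(-rho * (4/3) * pi * x**3) if x >= 0 else 0.0

# f should match the numerical derivative of F ...
h = 1e-6
for x in (0.5, 1.0, 2.0):
    assert abs((F(x + h) - F(x - h)) / (2 * h) - f(x)) < 1e-4

# ... and integrate to 1 (Riemann sum on [0, 10], where F(10) ~ 1)
n = 100_000
total = sum(f(10 * k / n) * (10 / n) for k in range(n))
assert abs(total - 1) < 1e-3
```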

7. Lognormal distribution. Let X ∼ N(0, σ²). Find the pdf of Y = e^X (known as the lognormal
pdf).

Solution: Since Y = e^X > 0, we have fY(y) = 0 for y ≤ 0. For y > 0,

P(Y ≤ y) = P(e^X ≤ y) = P(X ≤ ln y) = FX(ln y).

Taking the derivative with respect to y,

fY(y) = (1/y) fX(ln y) = (1/(y√(2π)σ)) e^{−(ln y)²/(2σ²)}   for y > 0.

8. Random phase signal. Let Y (t) = sin(ωt + Θ) be a sinusoidal signal with random phase
Θ ∼ U [−π, π]. Find the pdf of the random variable Y (t) (assume here that both t and the
radial frequency ω are constant). Comment on the dependence of the pdf of Y (t) on time t.

Solution: We can easily see (by plotting y vs. θ) that for y ∈ (−1, 1)

P(Y ≤ y) = P(sin(ωt + Θ) ≤ y)
         = P(sin(Θ) ≤ y)
         = (2 sin⁻¹(y) + π) / (2π)
         = sin⁻¹(y)/π + 1/2.

5
By differentiating with respect to y, we get

fY(y) = 1 / (π√(1 − y²)).

Note that fY(y) does not depend on time t; in this sense the signal is time invariant (or
stationary) (more on this later in the course).
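The arcsine cdf above can be checked against its derivative and against a Monte Carlo simulation of the random phase; the value ωt = 0.91 below is an arbitrary constant that drops out because the phase wraps around modulo 2π (a sketch, not part of the original handout):

```python
import random
from math import asin, pi, sin, sqrt

def F_Y(y):
    """cdf derived in the solution, valid for -1 < y < 1."""
    return asin(y) / pi + 0.5

def f_Y(y):
    return 1 / (pi * sqrt(1 - y**2))

# f_Y should be the derivative of F_Y
h = 1e-6
for y in (-0.9, 0.0, 0.5):
    assert abs((F_Y(y + h) - F_Y(y - h)) / (2 * h) - f_Y(y)) < 1e-3

# Monte Carlo over the uniform phase Theta ~ U[-pi, pi]
random.seed(0)
emp = sum(sin(0.91 + random.uniform(-pi, pi)) <= 0.5
          for _ in range(200_000)) / 200_000
assert abs(emp - F_Y(0.5)) < 0.01   # F_Y(0.5) = 2/3
```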

9. Quantizer. Let X ∼ Exp(λ), i.e., an exponential random variable with parameter λ and
Y = ⌊X⌋, i.e., Y = k for k ≤ X < k + 1, k = 0, 1, 2, . . .. Find the pmf of Y . Define the
quantization error Z = X − Y . Find the pdf of Z.

Solution: For k < 0, pY (k) = 0. Elsewhere

pY(k) = P{Y = k}
      = P{k ≤ X < k + 1}
      = FX(k + 1) − FX(k)
      = (1 − e^{−λ(k+1)}) − (1 − e^{−λk})
      = e^{−λk} − e^{−λ(k+1)}
      = e^{−λk}(1 − e^{−λ}).

Since Z = X − Y = X − ⌊X⌋ is the fractional part of X, fZ(z) = 0 for z < 0 or z ≥ 1. For
0 ≤ z < 1, we have

FZ(z) = P(Z ≤ z)
      = Σ_{k=0}^{∞} P(k ≤ X ≤ k + z)
      = Σ_{k=0}^{∞} (e^{−λk} − e^{−λ(k+z)})
      = (1 − e^{−λz}) / (1 − e^{−λ}).
By differentiating with respect to z, we get

fZ(z) = λe^{−λz} / (1 − e^{−λ})   for 0 ≤ z < 1.
Refer to Figure 2 for a graphical explanation of the above.
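Both derived distributions should be properly normalized, which is easy to confirm numerically; λ = 0.8 below is an arbitrary rate for the check (a sketch, not part of the original handout):

```python
from math import exp

lam = 0.8   # arbitrary rate, for checking only

def p_Y(k):
    """pmf of Y = floor(X) for X ~ Exp(lam): a geometric pmf on k = 0, 1, ..."""
    return exp(-lam * k) * (1 - exp(-lam))

def f_Z(z):
    """pdf of the quantization error Z = X - Y on [0, 1)."""
    return lam * exp(-lam * z) / (1 - exp(-lam))

# p_Y sums to 1 (tail beyond 200 terms is negligible)
assert abs(sum(p_Y(k) for k in range(200)) - 1) < 1e-12

# f_Z integrates to 1 over [0, 1) (midpoint rule)
n = 100_000
assert abs(sum(f_Z((k + 0.5) / n) / n for k in range(n)) - 1) < 1e-6
```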

10. Geometric with conditions. Let X be a geometric random variable with pmf

pX(k) = p(1 − p)^{k−1},   k = 1, 2, . . . .

Find and plot the conditional pmf pX (k|A) = P {X = k|X ∈ A} if:

[Figure 2: a) pdf of X, showing unit intervals I, II, III, . . . and, within each, a width-Δz
band from k + z to k + z + Δz; b) pmf of Y, where each mass pY(k) equals the area of the
corresponding unit interval of the pdf; c) pdf of Z, where the band from z to z + Δz collects
the bands from all the unit intervals.]

(a) A = {X > m} where m is a positive integer.
(b) A = {X < m}.
(c) A = {X is an even number}.

Comment on the shape of the conditional pmf of part (a).

Solution:

(a) We have

    P(A) = Σ_{n=m+1}^{∞} p(1 − p)^{n−1}
         = Σ_{n=0}^{∞} p(1 − p)^{n+m}
         = p(1 − p)^m Σ_{n=0}^{∞} (1 − p)^n
         = (1 − p)^m.

For k ≤ m, pX(k|A) = 0. For k > m,

    pX(k|A) = P{X = k | X > m}
            = P{X = k} / P{X > m}
            = p(1 − p)^{k−1} / (1 − p)^m
            = p(1 − p)^{k−m−1}.

(b) We have

    P(A) = Σ_{n=0}^{m−2} p(1 − p)^n
         = p (1 − (1 − p)^{m−1}) / (1 − (1 − p))
         = 1 − (1 − p)^{m−1}.

For k ≥ m or k ≤ 0, pX(k|A) = 0. For 0 < k < m,

    pX(k|A) = P{X = k | X < m}
            = P{X = k} / P{X < m}
            = p(1 − p)^{k−1} / (1 − (1 − p)^{m−1}).

(c) We have, substituting n = 2(n′ + 1) over the even values of n,

    P(A) = Σ_{n even} p(1 − p)^{n−1}
         = Σ_{n′=0}^{∞} p(1 − p)((1 − p)²)^{n′}
         = p(1 − p) / (1 − (1 − p)²)
         = (1 − p) / (2 − p).

For k odd, pX(k|A) = 0. For k even,

    pX(k|A) = P{X = k | X is even}
            = P{X = k} / P{X is even}
            = p(1 − p)^{k−1} / P(A)
            = p(2 − p)(1 − p)^{k−2}.

Plots are shown in Figure 3. The shape of the conditional pmf in part (a) shows that the
geometric random variable is memoryless:

    pX(k | X > m) = pX(k − m),   for k > m.

Note that in all three parts the conditional pmf pX(k|A) is specified for every k (it is zero
outside A), as required for a valid pmf.
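The three conditional pmfs can be verified numerically, using the same parameters as Figure 3 (p = 1/4, m = 5); a sketch, not part of the original handout:

```python
p, m = 0.25, 5   # the values used in Figure 3

def pX(k):
    return p * (1 - p)**(k - 1) if k >= 1 else 0.0

def cond_a(k):   # A = {X > m}
    return p * (1 - p)**(k - m - 1) if k > m else 0.0

def cond_b(k):   # A = {X < m}
    return pX(k) / (1 - (1 - p)**(m - 1)) if 0 < k < m else 0.0

def cond_c(k):   # A = {X is even}
    return p * (2 - p) * (1 - p)**(k - 2) if k >= 2 and k % 2 == 0 else 0.0

# Each conditional pmf sums to 1 (tail beyond k = 500 is negligible)
K = range(1, 500)
for pmf in (cond_a, cond_b, cond_c):
    assert abs(sum(pmf(k) for k in K) - 1) < 1e-9

# Memorylessness in part (a): pX(k | X > m) = pX(k - m) for k > m
for k in range(m + 1, 20):
    assert abs(cond_a(k) - pX(k - m)) < 1e-12
```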

[Figure 3: Plots of the conditional pmf's pX(x|A) for parts (a), (b), and (c), using p = 1/4
and m = 5.]

