
UCSD ECE 153 Handout #11

Prof. Young-Han Kim Thursday, April 14, 2011

Solutions to Homework Set #2


(Prepared by Lele Wang)

1. Polya’s urn. Suppose we have an urn containing one red ball and one blue ball. We draw a
ball at random from the urn. If it is red, we put the drawn ball plus another red ball into the
urn. If it is blue, we put the drawn ball plus another blue ball into the urn. We then repeat
this process. At the n-th stage, we draw a ball at random from the urn with n + 1 balls, note
its color, and put the drawn ball plus another ball of the same color into the urn.

(a) Find the probability that the first ball is red.


(b) Find the probability that the second ball is red.
(c) Find the probability that the first three balls are all red.
(d) Find the probability that two of the first three balls are red.

Solution: Let Xi denote the color of the i-th ball.

(a) By symmetry, P{X1 = R} = 1/2.


(b) Again by symmetry, P{Xi = R} = 1/2 for all i. Alternatively, by the law of total
probability, we have

P{X2 = R} = P{X1 = R}P{X2 = R | X1 = R} + P{X1 = B}P{X2 = R | X1 = B}


= 1/2 × 2/3 + 1/2 × 1/3 = 1/2.

(c) By the chain rule, we have

P{X1 = R, X2 = R, X3 = R}
= P{X1 = R}P{X2 = R | X1 = R}P{X3 = R | X2 = R, X1 = R}
= 1/2 × 2/3 × 3/4 = 1/4.

(d) Let N denote the number of red balls in the first three draws. From part (c), we know
that P{N = 3} = 1/4, and P{N = 0} = 1/4 by symmetry. Also, P{N = 2} = P{N = 1}
by symmetry. Since the four probabilities sum to one, P{N = 2} = (1 − 1/4 − 1/4)/2 = 1/4.

Alternatively, we have

P{N = 2} = P{X1 = B, X2 = R, X3 = R} + P{X1 = R, X2 = B, X3 = R}


+ P{X1 = R, X2 = R, X3 = B}
= P{X1 = B}P{X2 = R | X1 = B}P{X3 = R | X2 = R, X1 = B}
+ P{X1 = R}P{X2 = B | X1 = R}P{X3 = R | X2 = B, X1 = R}
+ P{X1 = R}P{X2 = R | X1 = R}P{X3 = B | X2 = R, X1 = R}
= 1/2 × 1/3 × 2/4 + 1/2 × 1/3 × 2/4 + 1/2 × 2/3 × 1/4 = 1/4.

2. Uniform arrival. The arrival time of a professor to his office is uniformly distributed in the
interval between 8 and 9 am. Find the probability that the professor will arrive during the
next minute given that he has not arrived by 8:30. Repeat for 8:50.

Solution: Let A be the event that "the professor arrives between 8:30 and 8:31", and let B
be the event that "the professor does not arrive before 8:30". Then we have

P (A|B) = P (A ∩ B)/P (B) = P (A)/P (B) = (1/60)/(1/2) = 1/30.

Similarly,
P({professor arrives between 8:50 and 8:51} | {he does not arrive before 8:50}) = (1/60)/(1/6) = 1/10.
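Both conditional probabilities can be checked with a quick Monte Carlo sketch (standard library only; the sample size and seed are arbitrary test choices):

```python
import random

rng = random.Random(7)
N = 200_000
# Arrival time in minutes after 8:00, uniform on [0, 60].
arrivals = [rng.uniform(0, 60) for _ in range(N)]

after_830 = [a for a in arrivals if a >= 30]
p_830 = sum(a <= 31 for a in after_830) / len(after_830)   # should approach 1/30

after_850 = [a for a in arrivals if a >= 50]
p_850 = sum(a <= 51 for a in after_850) / len(after_850)   # should approach 1/10
```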

3. Probabilities from a cdf. The cdf of the random variable X is given by



 1, x > 0,
1 2 2
FX (x) = + (x + 1) , −1 ≤ x ≤ 0,
 3 3
0, x < −1.

Find the probabilities of the following events.

(a) A = {X > 1/3}.
(b) B = {|X| ≥ 1}.
(c) C = {|X − 1/3| < 1}.
(d) D = {X < 0}.

Solution: We have

P (A) = 1 − P (Ac )
= 1 − P {X ≤ 1/3}
= 1 − FX (1/3)
= 0.

P (B) = P {X ≤ −1} + P {X ≥ 1}
= FX (−1) + (1 − FX (1))
= 1/3.
P (C) = P {−2/3 < X < 4/3}
= FX (4/3) − FX (−2/3)
= 1 − 11/27
= 16/27.
P (D) = FX (0−)
= 1,
since FX is continuous at x = 0 (there is no jump there).
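The four answers can be verified with a short exact-arithmetic computation (plain Python; Fraction avoids rounding):

```python
from fractions import Fraction

def F(x):
    # The given cdf: 0 for x < -1, 1/3 + (2/3)(x + 1)^2 on [-1, 0], 1 for x > 0.
    x = Fraction(x)
    if x < -1:
        return Fraction(0)
    if x <= 0:
        return Fraction(1, 3) + Fraction(2, 3) * (x + 1) ** 2
    return Fraction(1)

P_A = 1 - F(Fraction(1, 3))                   # P{X > 1/3}
P_B = F(-1) + (1 - F(1))                      # P{X <= -1} + P{X >= 1}
P_C = F(Fraction(4, 3)) - F(Fraction(-2, 3))  # P{-2/3 < X < 4/3}
P_D = F(0)                                    # P{X < 0}; F is continuous at 0
```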

4. Geometric with conditions. Let X be a geometric random variable with pmf

pX (k) = p(1 − p)k−1 , k = 1, 2, . . . .

Find and plot the conditional pmf pX (k|A) = P {X = k|X ∈ A} if:

(a) A = {X > m} where m is a positive integer.


(b) A = {X < m}.
(c) A = {X is an even number}.

Comment on the shape of the conditional pmf of part (a).

Solution:

(a) We have

P (A) = Σ_{n=m+1}^{∞} p(1 − p)^{n−1}
= Σ_{n=0}^{∞} p(1 − p)^{n+m}
= p(1 − p)^m Σ_{n=0}^{∞} (1 − p)^n
= (1 − p)^m .

For k ≤ m, pX (k|A) = 0. For k > m,

pX (k|A) = P {X = k|X > m}
= P {X = k}/P {X > m}
= p(1 − p)^{k−1} /(1 − p)^m
= p(1 − p)^{k−m−1} .

(b) We have
P (A) = Σ_{n=0}^{m−2} p(1 − p)^n
= p · (1 − (1 − p)^{m−1})/(1 − (1 − p))
= 1 − (1 − p)^{m−1} .

For k ≥ m or k ≤ 0, pX (k|A) = 0. For 1 ≤ k < m,

pX (k|A) = P {X = k|X < m}
= P {X = k}/P {X < m}
= p(1 − p)^{k−1} /(1 − (1 − p)^{m−1}).

(c) We have
P (A) = Σ_{n even} p(1 − p)^{n−1}
= Σ_{n′=0}^{∞} p(1 − p)((1 − p)^2)^{n′}
= p(1 − p)/(1 − (1 − p)^2)
= (1 − p)/(2 − p).

For k odd, pX (k|A) = 0. For k even,

pX (k|A) = P {X = k|X is even}
= P {X = k}/P {X is even}
= p(1 − p)^{k−1} /P (A)
= p(2 − p)(1 − p)^{k−2} .

Plots are shown in Figure 1. The shape of the conditional pmf in part (a) shows that the
geometric random variable is memoryless:

pX (x|X > k) = pX (x − k), for x > k.

Note that in all three parts the conditional pmf pX (x|A) must be specified (possibly as zero) for all x.

Figure 1: Plots of the conditional pmf pX (x|A) for parts (a), (b), and (c), using p = 1/4 and m = 5.
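The three closed forms and the memoryless property can be checked numerically. A sketch using the same p = 1/4 and m = 5 as the figure (the truncation point K is an arbitrary choice that makes the geometric tail negligible):

```python
p, m = 0.25, 5

def geom_pmf(k):
    # pX(k) = p(1-p)^(k-1), k = 1, 2, ...
    return p * (1 - p) ** (k - 1)

K = 2000  # truncation point; the tail beyond K is negligible for p = 0.25
P_gt_m = sum(geom_pmf(k) for k in range(m + 1, K))   # part (a): (1-p)^m
P_lt_m = sum(geom_pmf(k) for k in range(1, m))       # part (b): 1-(1-p)^(m-1)
P_even = sum(geom_pmf(k) for k in range(2, K, 2))    # part (c): (1-p)/(2-p)

closed_gt = (1 - p) ** m
closed_lt = 1 - (1 - p) ** (m - 1)
closed_even = (1 - p) / (2 - p)

# Memorylessness: pX(k | X > m) should equal pX(k - m) for k > m.
cond = geom_pmf(m + 3) / P_gt_m
```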

5. Distance to the nearest star. Let the random variable N be the number of stars in a region
of space of volume V . Assume that N is a Poisson r.v. with pmf

pN (n) = e^{−ρV} (ρV )^n /n!, for n = 0, 1, 2, . . . ,

where ρ is the "density" of stars in space. We choose an arbitrary point in space and define
the random variable X to be the distance from the chosen point to the nearest star. Find the
pdf of X (in terms of ρ).

Solution: The trick in this problem, as in many others, is to find a way to connect events

regarding X with events regarding N . In our case, for x ≥ 0:

FX (x) = P{X ≤ x}
= 1 − P{X > x}
= 1 − P{No stars within distance x}
= 1 − P{N = 0 in sphere centered at origin of radius x}
= 1 − e^{−(4/3)πρx^3} .

Now differentiating, we get


fX (x) = 4πρx^2 e^{−(4/3)πρx^3} .

For x < 0, both the cdf and the pdf are zero.
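The derived cdf can be sanity-checked by scattering a Poisson number of "stars" uniformly in a cube and measuring how often none lands within distance x0 of the origin. A Monte Carlo sketch (ρ, the cube half-side L, x0, and the seed are arbitrary test values; the cube must contain the test sphere):

```python
import math
import random

rng = random.Random(1)

def poisson(mean, rng):
    # Knuth's method; adequate for the moderate means used here.
    thresh = math.exp(-mean)
    k, prod = 0, 1.0
    while True:
        prod *= rng.random()
        if prod <= thresh:
            return k
        k += 1

rho, L = 1.0, 1.5   # assumed density and cube half-side (L > x0)
x0 = 0.5            # distance at which we test the derived cdf
trials = 20_000

exceed = 0
for _ in range(trials):
    n = poisson(rho * (2 * L) ** 3, rng)
    pts = ((rng.uniform(-L, L), rng.uniform(-L, L), rng.uniform(-L, L))
           for _ in range(n))
    # X > x0 iff no star falls inside the sphere of radius x0.
    if all(math.dist((0, 0, 0), pt) > x0 for pt in pts):
        exceed += 1

emp = exceed / trials
theory = math.exp(-rho * (4 / 3) * math.pi * x0 ** 3)   # P{X > x0} from the cdf
```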

6. Random phase signal. Let Y (t) = sin(ωt + Θ) be a sinusoidal signal with random phase
Θ ∼ U [−π, π]. Find the pdf of the random variable Y (t) (assume here that both t and the
radial frequency ω are constant). Comment on the dependence of the pdf of Y (t) on time t.

Solution: We can easily see (by plotting y vs. θ) that for y ∈ (−1, 1)

P (Y ≤ y) = P (sin(ωt + Θ) ≤ y)
= 2 ( ∫_{−π/2}^{0} fΘ (θ) dθ + ∫_{0}^{sin^{−1}(y)} fΘ (θ) dθ )
= (2/(2π)) (sin^{−1}(y) + π/2)
= sin^{−1}(y)/π + 1/2.
By differentiating with respect to y, we get

fY (y) = 1/(π √(1 − y^2)), for −1 < y < 1.

Note that fY (y) does not depend on the time t, i.e., the distribution of Y (t) is time invariant
(or stationary) (more on this later in the course).
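Both the arcsine-law cdf and its time invariance can be checked by sampling the phase. A sketch (ω, the test times, y0, and the seed are arbitrary choices):

```python
import math
import random

rng = random.Random(2)
omega = 3.0     # assumed fixed radial frequency
n = 100_000
y0 = 0.3        # point at which we evaluate the cdf

def empirical_cdf_at(t):
    # Estimate P{Y(t) <= y0} by sampling Theta ~ Unif[-pi, pi].
    hits = sum(math.sin(omega * t + rng.uniform(-math.pi, math.pi)) <= y0
               for _ in range(n))
    return hits / n

theory = math.asin(y0) / math.pi + 0.5
emp1 = empirical_cdf_at(0.7)
emp2 = empirical_cdf_at(5.2)   # a different time: same distribution
```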

7. Quantizer. Let X ∼ Exp(λ), i.e., an exponential random variable with parameter λ and
Y = bXc, i.e., Y = k for k ≤ X < k + 1, k = 0, 1, 2, . . .. Find the pmf of Y . Define the
quantization error Z = X − Y . Find the pdf of Z.

Figure 2: a) pdf of X, b) pmf of Y, c) pdf of Z.

Solution: For k < 0, pY (k) = 0. Elsewhere

pY (k) = P {Y = k}
= P {k ≤ X < k + 1}
= FX (k + 1) − FX (k)
= (1 − e^{−λ(k+1)}) − (1 − e^{−λk})
= e^{−λk} − e^{−λ(k+1)}
= e^{−λk} (1 − e^{−λ}).

Since Z = X − Y = X − bXc is the fractional part of X, fZ (z) = 0 for z < 0 or z ≥ 1. For


0 ≤ z < 1, we have

FZ (z) = P (Z ≤ z)
= Σ_{k=0}^{∞} P (k ≤ X ≤ k + z)
= Σ_{k=0}^{∞} (e^{−λk} − e^{−λ(k+z)})
= (1 − e^{−λz})/(1 − e^{−λ}).
By differentiating with respect to z, we get

fZ (z) = λe^{−λz} /(1 − e^{−λ})

for 0 ≤ z < 1.
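The pmf of Y and the pdf of Z can be checked numerically; a sketch (the rate λ and the truncation point are arbitrary test choices):

```python
import math

lam = 0.8   # arbitrary rate for the check

def pY(k):
    # P{k <= X < k+1} computed directly from the exponential cdf.
    return math.exp(-lam * k) - math.exp(-lam * (k + 1))

# pY should match the closed form e^{-lam k}(1 - e^{-lam}) and sum to 1.
total = sum(pY(k) for k in range(2000))

def FZ(z):
    # cdf of the quantization error Z on [0, 1).
    return (1 - math.exp(-lam * z)) / (1 - math.exp(-lam))

# Numerical derivative of FZ should match the derived pdf of Z.
h = 1e-6
fz_num = (FZ(0.5 + h) - FZ(0.5 - h)) / (2 * h)
fz_closed = lam * math.exp(-lam * 0.5) / (1 - math.exp(-lam))
```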

8. Gambling. Alice enters a casino with one unit of capital. She looks at her watch to generate
a uniform random variable U ∼ unif [0, 1], then bets the amount U on a fair coin flip. Her
wealth is thus given by the r.v.

X = 1 + U with probability 1/2, and X = 1 − U with probability 1/2.

Find the cdf of X.

Solution: For x < 0, the cdf is zero, and for x > 2 the cdf is one. In between, we can find

the cdf using the law of total probability. So consider

FX (x) = P{X ≤ x}
= P{X ≤ x, H} + P{X ≤ x, T}
= (1/2)(P{X ≤ x|H} + P{X ≤ x|T})
= (1/2)(P{U ≤ x − 1} + P{U ≥ 1 − x})
= (1/2)(0 + x) = x/2 for 0 ≤ x ≤ 1,
= (1/2)((x − 1) + 1) = x/2 for 1 < x ≤ 2,

so that FX (x) = x/2 for 0 ≤ x ≤ 2.
Thus, X ∼ Unif[0, 2].
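The Unif[0, 2] conclusion is easy to confirm by simulation; a sketch (sample size, seed, and the evaluation points are arbitrary):

```python
import random

rng = random.Random(3)
N = 100_000
xs = []
for _ in range(N):
    u = rng.random()            # the bet U ~ Unif[0, 1]
    win = rng.random() < 0.5    # fair coin flip
    xs.append(1 + u if win else 1 - u)

# If X ~ Unif[0, 2], the empirical cdf at x should be near x/2.
emp = {x: sum(v <= x for v in xs) / N for x in (0.5, 1.0, 1.5)}
```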

Solution to Additional Exercises

1. Juror’s fallacy. Suppose that P (A|B) ≥ P (A) and P (A|C) ≥ P (A). Is it always true that
P (A|B, C) ≥ P (A) ? Prove or provide a counterexample.

Solution: The answer is no. There are many counterexamples that can be given. For
example, suppose a fair die is thrown and let X denote the number of dots. Let A be the
event that X = 3 or 6; let B be the event that X = 3 or 5; and let C be the event that X = 5
or 6. Then, we have

P (A) = 1/3, P (A|B) = P (A|C) = 1/2, but P (A|B, C) = 0.

Thus, two pieces of positive evidence do not necessarily combine into stronger evidence.
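The counterexample can be verified by direct enumeration over the six die outcomes:

```python
from fractions import Fraction

OMEGA = set(range(1, 7))        # fair die
A, B, C = {3, 6}, {3, 5}, {5, 6}

def P(event):
    return Fraction(len(event & OMEGA), len(OMEGA))

def P_given(event, given):
    # Conditional probability by counting equally likely outcomes.
    return Fraction(len(event & given & OMEGA), len(given & OMEGA))
```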

2. Probabilities from a cdf. Let X be a random variable with the cdf shown below.

[Figure: the cdf F (x). Consistent with the computations below, F (x) = x^2 /3 for 0 ≤ x ≤ 1, F (x) = 1/3 for 1 ≤ x < 2, F jumps to 2/3 at x = 2, and F increases linearly to 1 at x = 4 (so F (3) = 5/6).]

Find the probabilities of the following events.

(a) {X = 2}.
(b) {X < 2}.
(c) {X = 2} ∪ {0.5 ≤ X ≤ 1.5}.
(d) {X = 2} ∪ {0.5 ≤ X ≤ 3}.

Solution:

(a) There is a jump at X = 2, so we have

P{X = 2} = P{X ≤ 2} − P{X < 2}
= F (2+) − F (2−)
= 2/3 − 1/3
= 1/3.

(b) P{X < 2} = F (2−) = 1/3.

(c) Since {X = 2} and {0.5 ≤ X ≤ 1.5} are disjoint events,

P({X = 2} ∪ {0.5 ≤ X ≤ 1.5}) = P{X = 2} + P{0.5 ≤ X ≤ 1.5}
= 1/3 + F (1.5) − F (0.5)
= 1/3 + 1/3 − (1/3)(0.5)^2
= 7/12.

(d) We have

P({X = 2} ∪ {0.5 ≤ X ≤ 3}) = P{0.5 ≤ X ≤ 3}
= F (3) − F (0.5)
= 5/6 − (1/3)(0.5)^2
= 3/4.

3. Gaussian probabilities. Let X ∼ N (1000, 400). Express the following in terms of the Q
function.

(a) P{0 < X < 1020}.


(b) P{X < 1020|X > 960}.

Solution: Since (X − µ)/σ ∼ N (0, 1), we have F (x) = Φ((x − µ)/σ) = 1 − Q((x − µ)/σ).

(a) We have

P{0 < X < 1020} = Q((0 − 1000)/20) − Q((1020 − 1000)/20) = Q(−50) − Q(1).

(b) We have

P{X < 1020|X > 960} = P{960 < X < 1020}/P{X > 960}
= (Q((960 − 1000)/20) − Q((1020 − 1000)/20))/Q((960 − 1000)/20)
= (Q(−2) − Q(1))/Q(−2).
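The Q function is not in the Python standard library, but it can be expressed through the complementary error function, which is. A sketch evaluating both answers numerically:

```python
import math

def Q(x):
    # Standard normal tail probability via the complementary error function.
    return 0.5 * math.erfc(x / math.sqrt(2))

mu, sigma = 1000, 20   # X ~ N(1000, 400): variance 400, standard deviation 20

part_a = Q((0 - mu) / sigma) - Q((1020 - mu) / sigma)   # Q(-50) - Q(1)
part_b = (Q(-2) - Q(1)) / Q(-2)
```

Numerically, Q(−50) is indistinguishable from 1, so part (a) is essentially 1 − Q(1) ≈ 0.84.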

4. Consider the Laplacian random variable X with pdf f (x) = (1/2)e^{−|x|} .

(a) Sketch the cdf of X.


(b) Find P{|X| ≤ 2 or X ≥ 0} .
(c) Find P{|X| + |X − 3| ≤ 3} .

Solution:

(a) We have
FX (x) = ∫_{−∞}^{x} (1/2)e^{−|u|} du = (1/2)e^{x} if x < 0, and 1 − (1/2)e^{−x} if x ≥ 0.

Figure 3: cdf of X.

(b) We have

P{|X| ≤ 2 or X ≥ 0} = P{X ≥ −2}
= 1 − P{X < −2}
= 1 − ∫_{−∞}^{−2} (1/2)e^{−|x|} dx
= 1 − (1/2)e^{−2} .
(c) We have

P{|X| + |X − 3| ≤ 3} = P{0 ≤ X ≤ 3}
= ∫_{0}^{3} (1/2)e^{−|x|} dx
= 1/2 − (1/2)e^{−3} .

5. Let A be an event with nonzero probability. Show that

(a) P(A) = P(A|X ≤ x)FX (x) + P(A|X > x)(1 − FX (x)).
(b) FX (x|A) = (P(A|X ≤ x)/P(A)) FX (x).

Solution:

(a) Using the fact that {X ≤ x} and {X > x} are events that partition Ω, we have

P(A) = P(A| X ≤ x)P(X ≤ x) + P(A| X > x)P(X > x)


= P(A| X ≤ x)P(X ≤ x) + P(A| X > x)(1 − P(X ≤ x))
= P(A| X ≤ x)FX (x) + P(A| X > x)(1 − FX (x))

(b) By the definition of conditional probability and the chain rule,

FX (x| A) = P (X ≤ x| A)
= P({X ≤ x} ∩ A)/P(A)
= P(A| X ≤ x)P(X ≤ x)/P(A)
= (P(A| X ≤ x)/P(A)) FX (x).

6. Time until the first arrival. Let the random variable N be the number of packet arrivals
during the time interval [0, t]. Assume that N is a Poisson r.v. with pmf

pN (n) = e^{−λt} (λt)^n /n!, for n = 0, 1, 2, . . . ,
where λ is the rate of packet arrivals. Define the random variable X to be the time until the
first packet arrives. Find the pdf of X (in terms of λ).

Solution: To find the pdf of X, we first consider P{X > t}, the probability that the first
arrival occurs after time t, i.e., that the number of packet arrivals during the interval [0, t]
is zero.

P{X > t} = P{N = 0 in time interval [0, t]}


= pN (0)
= e−λt for t ≥ 0.

Thus, FX (x) = P{X ≤ x} = 1 − P{X > x} = 1 − e−λx for x ≥ 0.


Taking the derivative with respect to x, we have

fX (x) = λe−λx for x ≥ 0.
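The exponential cdf can be checked by simulating the Poisson arrivals directly: given N = n arrivals in [0, T], the arrival times are i.i.d. uniform on [0, T], and the first arrival is their minimum. A Monte Carlo sketch (λ, T, x0, and the seed are arbitrary test values):

```python
import math
import random

rng = random.Random(4)

def poisson(mean, rng):
    # Knuth's method; fine for moderate means.
    thresh = math.exp(-mean)
    k, prod = 0, 1.0
    while True:
        prod *= rng.random()
        if prod <= thresh:
            return k
        k += 1

lam, T = 2.0, 10.0      # rate and observation window (assumed test values)
trials = 20_000
firsts = []
for _ in range(trials):
    n = poisson(lam * T, rng)
    # Given N = n, arrival times are i.i.d. uniform on [0, T].
    firsts.append(min((rng.uniform(0, T) for _ in range(n)), default=T))

x0 = 0.5
emp = sum(v <= x0 for v in firsts) / trials
theory = 1 - math.exp(-lam * x0)     # the derived exponential cdf
```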

7. Nonlinear processing. Let X ∼ Unif [−1, 1]. Define the random variable

Y = X^2 + 1 if |X| ≥ 0.5, and Y = 0 otherwise.

Find and sketch the cdf of Y .

Solution: From the definition, Y takes values in {0} ∪ [1.25, 2]. Thus, FY (y) = 0 for y < 0
and FY (y) = 1 for y ≥ 2. For y ∈ [0, 1.25),

P{Y ≤ y} = P{Y = 0}
= P{|X| < 0.5}
= 1/2.
For y ∈ [1.25, 2),

P{Y ≤ y} = P{Y = 0} + P{1.25 ≤ Y ≤ y}
= 1/2 + P{X^2 + 1 ≤ y, |X| ≥ 0.5}
= 1/2 + P{−√(y − 1) ≤ X ≤ −0.5 or 0.5 ≤ X ≤ √(y − 1)}
= 1/2 + √(y − 1) − 1/2
= √(y − 1).

In sum, the cdf of Y is:

FY (y) = 0 if y < 0,
FY (y) = 1/2 if 0 ≤ y < 1.25,
FY (y) = √(y − 1) if 1.25 ≤ y < 2,
FY (y) = 1 if y ≥ 2.

Figure 4: cdf of Y.
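The piecewise cdf can be confirmed by simulating the nonlinearity directly; a sketch (sample size, seed, and evaluation points are arbitrary):

```python
import math
import random

rng = random.Random(5)
N = 100_000
ys = []
for _ in range(N):
    x = rng.uniform(-1, 1)
    # Apply the nonlinearity: Y = X^2 + 1 if |X| >= 0.5, else 0.
    ys.append(x * x + 1 if abs(x) >= 0.5 else 0.0)

def FY(y):
    # The cdf derived above.
    if y < 0:
        return 0.0
    if y < 1.25:
        return 0.5
    if y < 2:
        return math.sqrt(y - 1)
    return 1.0

emp = {y: sum(v <= y for v in ys) / N for y in (0.5, 1.5, 1.9)}
```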

8. Lognormal distribution. Let X ∼ N (0, σ 2 ). Find the pdf of Y = eX (known as the lognormal
pdf).

Solution: Since Y = eX > 0, we have fY (y) = 0 for y ≤ 0. For y > 0,

P(Y ≤ y) = P(eX ≤ y) = P(X ≤ ln(y)) = FX (ln(y))

Taking the derivative with respect to y,

fY (y) = (1/y) fX (ln(y)) = (1/(y√(2π) σ)) e^{−(ln y)^2 /(2σ^2)} , for y > 0.
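The lognormal cdf relation P{Y ≤ y} = Φ(ln(y)/σ) can be checked by exponentiating Gaussian samples; a sketch (σ, y0, and the seed are arbitrary test values):

```python
import math
import random

rng = random.Random(6)
sigma = 0.5        # assumed standard deviation for the check
n = 100_000
ys = [math.exp(rng.gauss(0, sigma)) for _ in range(n)]

def Phi(z):
    # Standard normal cdf via the error function.
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

y0 = 1.5
emp = sum(v <= y0 for v in ys) / n
theory = Phi(math.log(y0) / sigma)   # P{Y <= y0} = P{X <= ln(y0)}
```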
