
Addis Ababa Science & Technology University

Department of Electrical & Computer Engineering

Probability and Random Process (EEEg-2114)

Chapter 2: Random Variables


Outline
- Random Variables (RVs)
- The Cumulative Distribution Function (CDF) of RVs
- Types of RVs
  - Continuous Random Variable (CRV)
  - Discrete Random Variable (DRV)
- Probability Density Function (PDF) of a CRV
- Probability Mass Function (PMF) of a DRV
- Expected Value & Variance of RVs
- Some distributions (RVs) with their unique applications
  - DRV type: Bernoulli, binomial, Poisson, geometric, hypergeometric, negative binomial
  - CRV type: uniform, exponential, Laplace, Cauchy, Gaussian, gamma, chi-squared, Rayleigh
Random Variable (RV) X
- A random variable is a function that assigns a real number X(ω) to each outcome ω in the sample space Ω of a random experiment.
- Its domain D is the sample space Ω, and its range R_X is the set of all values taken on by X, which is a subset of the real numbers.
- A random variable is denoted by a capital letter (such as X, Y, or W) and any particular value of it by the corresponding lowercase letter (x, y, or w).
- Random variables are important because they provide a compact way of referring to events via their numerical attributes. For example, if X models the number of visits to a website, it is much easier to write P(X > 1000) than to write P(number of visits > 1000).
Conditions for a Function to be a RV
For a function X to be a random variable:
1. It must not be multi-valued, i.e., every point in Ω must correspond to exactly one value of the RV (the mapping may be one-to-one or many-to-one, but never one-to-many).
2. The set {X ≤ x} must be an event for every real number x. This set corresponds to those points ω in Ω for which the RV X(ω) does not exceed the number x. The probability of this event, P{X ≤ x}, equals the sum of the probabilities of all the elementary events corresponding to {X ≤ x}; we call it the Cumulative Distribution Function (CDF).
3. P{X = ∞} = 0 and P{X = −∞} = 0, i.e., an outcome has no chance of being infinite.
Example:
Consider a random experiment of tossing a fair coin 3 times. The sequence of heads and tails is noted and the sample space Ω is given by:
  Ω = {HHH, HHT, HTH, THH, THT, HTT, TTH, TTT}
Let X be the number of heads in the three coin tosses. X assigns each possible outcome ω in the sample space Ω a number from the set R_X = {0, 1, 2, 3}.

Fig: Illustration of a random variable X that counts the number of heads in a sequence of three coin tosses.
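The mapping in the figure can be sketched directly in code; this is a minimal illustration (variable names are our own, not from the slides):

```python
from itertools import product

# Sample space of three coin tosses: all sequences of H and T.
omega = ["".join(seq) for seq in product("HT", repeat=3)]

# The random variable X maps each outcome omega to its number of heads.
X = {w: w.count("H") for w in omega}

print(len(omega))                # 8 outcomes in the sample space
print(sorted(set(X.values())))   # range R_X: [0, 1, 2, 3]
print(X["HTH"])                  # 2 heads in the outcome HTH
```

Note that the mapping is many-to-one (e.g., HHT, HTH, and THH all map to 2), which is allowed by the conditions above.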
The Cumulative Distribution Function
- The cumulative distribution function (cdf) of a random variable X is defined as the probability of the event {X ≤ x}:
  F_X(x) = P(X ≤ x)

Properties of the cdf, F_X(x):
i.   F_X(x) is a non-negative function bounded by one, i.e., 0 ≤ F_X(x) ≤ 1
ii.  lim_{x→∞} F_X(x) = 1
iii. lim_{x→−∞} F_X(x) = 0
The Cumulative Distribution Function Cont'd…
iv. F_X(x) is a non-decreasing function of x, i.e., if x₁ < x₂, then F_X(x₁) ≤ F_X(x₂)
v.  P(x₁ < X ≤ x₂) = F_X(x₂) − F_X(x₁)
vi. P(X > x) = 1 − F_X(x)
Example: Find the cdf of the random variable X which is defined as the number of heads in three tosses of a fair coin.
Solution:
We know that X takes on only the values 0, 1, 2 and 3 with probabilities 1/8, 3/8, 3/8 and 1/8 respectively. Thus, F_X(x) is simply the sum of the probabilities of the outcomes from the set {0, 1, 2, 3} that are less than or equal to x.

  F_X(x) = { 0,    x < 0
           { 1/8,  0 ≤ x < 1
           { 1/2,  1 ≤ x < 2
           { 7/8,  2 ≤ x < 3
           { 1,    x ≥ 3
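This staircase cdf can be checked numerically; here is a small sketch (the helper name `cdf` is our own) built from the pmf {0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8}:

```python
from fractions import Fraction as F

# pmf of the number of heads in three fair-coin tosses
pmf = {0: F(1, 8), 1: F(3, 8), 2: F(3, 8), 3: F(1, 8)}

def cdf(x):
    """F_X(x) = sum of pmf values at points x_k <= x."""
    return sum(p for xk, p in pmf.items() if xk <= x)

print(cdf(-1))    # 0    (below all jump points)
print(cdf(0.5))   # 1/8  (only x_k = 0 counted)
print(cdf(1.7))   # 1/2  (x_k = 0 and 1 counted)
print(cdf(5))     # 1    (all mass counted)
```

Evaluating between jump points (e.g., at 0.5 and 1.7) confirms the staircase shape: the cdf is flat between the x_k and jumps by pmf(x_k) at each one.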
Types of Random Variables
There are two basic types of random variables.
i. Continuous Random Variable
- A random variable whose cdf, F_X(x), is continuous everywhere and can be written as an integral of some non-negative function f(x), i.e.,
  F_X(x) = ∫_{−∞}^{x} f(u) du
- Defined over a sample space having an uncountably infinite number of sample points.

ii. Discrete Random Variable
- A random variable whose cdf, F_X(x), is a right-continuous, staircase function of x with jumps at a countable set of points x₀, x₁, x₂, …
- Defined over a sample space having a finite or a countably infinite number of sample points.
The Probability Density Function
- The probability density function (pdf) of a continuous random variable X is defined as the derivative of the cdf, F_X(x), i.e.,
  f_X(x) = dF_X(x)/dx

Properties of the pdf, f_X(x):
i.   f_X(x) ≥ 0 for all values of x
ii.  ∫_{−∞}^{∞} f_X(x) dx = 1
iii. P(x₁ < X ≤ x₂) = ∫_{x₁}^{x₂} f_X(x) dx
The Probability Mass Function
- The probability mass function (pmf) of a discrete random variable X is defined as:
  P_X(x_i) = P(X = x_i) = F_X(x_i) − F_X(x_{i−1})

Properties of the pmf, P_X(x_k):
i.   0 ≤ P_X(x_k) ≤ 1,  k = 1, 2, …
ii.  P_X(x) = 0, if x ≠ x_k,  k = 1, 2, …
iii. Σ_k P_X(x_k) = 1
Calculating the Cumulative Distribution Function
- The cdf of a continuous random variable X can be obtained by integrating the pdf, i.e.,
  F_X(x) = ∫_{−∞}^{x} f_X(u) du
- Similarly, the cdf of a discrete random variable X can be obtained by using the formula:
  F_X(x) = Σ_{x_k ≤ x} P_X(x_k) u(x − x_k)
where u(·) is the unit step function.
Expected Value, Variance and Moments
I. Expected Value (Mean)
- The expected value (mean) of a continuous random variable X, denoted by μ_X or E(X), is defined as:
  μ_X = E(X) = ∫_{−∞}^{∞} x f_X(x) dx
- Similarly, the expected value of a discrete random variable X is given by:
  μ_X = E(X) = Σ_k x_k P_X(x_k)
- The mean represents the average value of the random variable over a very large number of trials.
Expected Value, Variance and Moments Cont'd…
II. Variance
- The variance of a continuous random variable X, denoted by σ_X² or Var(X), is defined as:
  σ_X² = Var(X) = E[(X − μ_X)²]
  σ_X² = Var(X) = ∫_{−∞}^{∞} (x − μ_X)² f_X(x) dx
- Expanding (X − μ_X)² in the above equation and simplifying the resulting expression, we get:
  σ_X² = Var(X) = E(X²) − [E(X)]²
- The variance of a discrete random variable X is given by:
  σ_X² = Var(X) = Σ_k (x_k − μ_X)² P_X(x_k)
Expected Value, Variance and Moments Cont'd…
- The standard deviation of a random variable X, denoted by σ_X, is simply the square root of the variance, i.e.,
  σ_X = √(E[(X − μ_X)²]) = √(Var(X))

III. Moments
- The nth moment of a continuous random variable X is defined as:
  E(X^n) = ∫_{−∞}^{∞} x^n f_X(x) dx,  n ≥ 1
- Similarly, the nth moment of a discrete random variable X is given by:
  E(X^n) = Σ_k x_k^n P_X(x_k),  n ≥ 1
- The mean of X is the first moment of the random variable X.
Random Variable Examples

Example-1: (continuous rv)
The pdf of a continuous random variable X is given by:
  f_X(x) = { kx, 0 ≤ x ≤ 1
           { 0,  otherwise
where k is a constant.
a. Determine the value of k.
b. Find the corresponding cdf of X.
c. Find P(1/4 ≤ X ≤ 1).
d. Evaluate the mean and variance of X.
Random Variable Examples Cont'd…

Solution:
a. ∫_{−∞}^{∞} f_X(x) dx = 1  ⇒  ∫_0^1 kx dx = 1
   ⇒ k [x²/2]_0^1 = 1
   ⇒ k/2 = 1
   ⇒ k = 2

  ⇒ f_X(x) = { 2x, 0 ≤ x ≤ 1
             { 0,  otherwise
Random Variable Examples Cont'd…

Solution:
b. The cdf of X is given by:
   F_X(x) = ∫_{−∞}^{x} f_X(u) du
Case 1: for x < 0,
   F_X(x) = 0, since f_X(x) = 0 for x < 0.
Case 2: for 0 ≤ x < 1,
   F_X(x) = ∫_{−∞}^{x} f_X(u) du = ∫_0^x 2u du = [u²]_0^x = x².
Random Variable Examples Cont'd…

Solution:
Case 3: for x ≥ 1,
   F_X(x) = ∫_{−∞}^{x} f_X(u) du = ∫_0^1 2u du = [u²]_0^1 = 1.
⇒ The cdf is given by:
   F_X(x) = { 0,   x < 0
            { x²,  0 ≤ x < 1
            { 1,   x ≥ 1
Random Variable Examples Cont'd…

Solution:
c. P(1/4 ≤ X ≤ 1)
i. Using the pdf:
   P(1/4 ≤ X ≤ 1) = ∫_{1/4}^{1} f_X(x) dx = ∫_{1/4}^{1} 2x dx
   ⇒ P(1/4 ≤ X ≤ 1) = [x²]_{1/4}^{1} = 1 − 1/16 = 15/16
ii. Using the cdf:
   P(1/4 ≤ X ≤ 1) = F_X(1) − F_X(1/4) = 1 − (1/4)² = 15/16
Random Variable Examples Cont'd…

Solution:
d. Mean and Variance
i. Mean:
   μ_X = E(X) = ∫_0^1 x f_X(x) dx = ∫_0^1 2x² dx
   ⇒ μ_X = [2x³/3]_0^1 = 2/3
ii. Variance:
   σ_X² = Var(X) = E(X²) − [E(X)]²
   E(X²) = ∫_0^1 x² f_X(x) dx = ∫_0^1 2x³ dx = 1/2
   ⇒ σ_X² = Var(X) = 1/2 − (2/3)² = 1/18
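The results of Example-1 can be verified numerically by approximating the integrals with a midpoint Riemann sum; this is only a sanity-check sketch (step count and names are our own):

```python
# Numerical check of Example-1: f(x) = 2x on [0, 1].
# Expected: total mass 1, mean 2/3, variance 1/18.
N = 200_000
dx = 1.0 / N

total = mean = second = 0.0
for i in range(N):
    x = (i + 0.5) * dx          # midpoint of the i-th subinterval
    fx = 2.0 * x                # pdf value
    total  += fx * dx           # approximates integral of f(x)   -> 1
    mean   += x * fx * dx       # approximates E(X)               -> 2/3
    second += x * x * fx * dx   # approximates E(X^2)             -> 1/2

var = second - mean**2          # Var(X) = E(X^2) - [E(X)]^2      -> 1/18
print(round(total, 6), round(mean, 6), round(var, 6))
```

With N = 200,000 subintervals the midpoint rule reproduces 1, 2/3 ≈ 0.666667, and 1/18 ≈ 0.055556 to six decimal places.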
Example 2: Let the random variable X have cdf

Find the density and sketch both the cdf and pdf.

Solution

Random Variable Examples Cont'd…

Example-3: (discrete rv)
Consider a discrete random variable X whose pmf is given by:
  P_X(x_k) = { 1/3, x_k = −1, 0, 1
             { 0,   otherwise
Find the mean and variance of X.
Random Variable Examples Cont'd…

Solution:
i. Mean:
   μ_X = E(X) = Σ_{k=−1}^{1} x_k P_X(x_k) = (1/3)(−1 + 0 + 1) = 0
ii. Variance:
   σ_X² = Var(X) = E(X²) − [E(X)]²
   E(X²) = Σ_{k=−1}^{1} x_k² P_X(x_k) = (1/3)[(−1)² + (0)² + (1)²] = 2/3
   ⇒ σ_X² = Var(X) = 2/3 − (0)² = 2/3
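The discrete mean and variance formulas above translate directly into code; a minimal check of Example-3 (exact arithmetic via fractions, names our own):

```python
from fractions import Fraction as F

# pmf of Example-3: mass 1/3 on each of -1, 0, 1.
pmf = {-1: F(1, 3), 0: F(1, 3), 1: F(1, 3)}

mean = sum(x * p for x, p in pmf.items())        # E(X)   = sum x_k * P(x_k)
second = sum(x * x * p for x, p in pmf.items())  # E(X^2) = sum x_k^2 * P(x_k)
var = second - mean**2                           # Var(X) = E(X^2) - [E(X)]^2

print(mean, var)  # 0 2/3
```

Using `Fraction` avoids floating-point rounding, so the results match the hand computation exactly.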
Example-4: (mixed rv) Consider the generalized density

Compute P(0 < Y ≤ 7) and P(Y = 0).

Solution: In computing P(0 < Y ≤ 7), the impulse at the origin makes no contribution, but the impulse at 7 does. Thus,

Similarly, in computing P(Y = 0) = P(Y ∈ {0}), only the impulse at zero makes a contribution. Thus,
Exercises
1. A random variable X has generalized density

where u is the unit step function and δ is the Dirac delta function.
(a) Sketch f(t).
(b) Compute P(X = 0) and P(X = 1).
(c) Compute P(0 < X < 1) and P(X > 1).
(d) Use your above results to compute P(0 ≤ X ≤ 1) and P(X ≥ 1).
(e) Compute E[X].

2. Show that E[X] = 7/12 if X has cdf
3. The continuous random variable X has the pdf given by:

4. The cdf of a continuous random variable X is given by:
5. A r.v. X is defined by the cdf

6. A random variable X has mean 2 and variance 7. Find E[X²].

7. Let X be a random variable with mean m and variance σ².
Find the constant c that best approximates the random variable X in the sense that c minimizes the mean-squared error E[(X − c)²].

8. Let X have the Pareto density f(x) = 2/x³ for x ≥ 1 and f(x) = 0 otherwise. Compute E[X] and E[X²].
Some Special Distributions with their Special Applications
I. Discrete Probability Distributions
1. Bernoulli Distribution
- A r.v. X is called a Bernoulli r.v. with parameter p if its pmf is given by
  P(X = 1) = p,  P(X = 0) = 1 − p,  0 < p < 1.
- It is associated with an experiment where an outcome can be classified as either a "success" or a "failure," the probability of a success being p and the probability of a failure 1 − p.
- Its cdf F_X(x) is 0 for x < 0, 1 − p for 0 ≤ x < 1, and 1 for x ≥ 1.
- Its mean and variance are E(X) = p and Var(X) = p(1 − p).
2. Binomial Distribution
- A r.v. X is called a binomial r.v. with parameters (n, p) if its pmf is
  P(X = k) = C(n, k) p^k q^(n−k),  k = 0, 1, 2, …, n,  where q = 1 − p.
- It is associated with experiments in which n independent Bernoulli trials are performed and X represents the number of successes that occur in the n trials.
- A Bernoulli r.v. is just a binomial r.v. with parameters (1, p).
- Its mean and variance are E(X) = np and Var(X) = npq, i.e., just n times the mean and variance of a Bernoulli r.v.


The binomial theorem says that for any complex numbers a and b,
  (a + b)^n = Σ_{k=0}^{n} C(n, k) a^k b^(n−k).
The quantity C(n, k) = n!/(k!(n − k)!) is called the binomial coefficient. It is convenient to know that the binomial coefficients can be read off from the nth row of Pascal's triangle in the figure below, where the top row is row 0.
Example (Binomial):
A homeowner has just installed 20 light bulbs in a new home. Suppose that each has a probability 0.2 of functioning for more than three months. (a) What is the probability that at least five of these function for more than three months? (b) What is the average number of bulbs the homeowner has to replace in three months?
Solution: It is reasonable to assume that the light bulbs perform independently. If X is the number of bulbs functioning more than three months (success), it has a binomial distribution with n = 20 and p = 0.2.
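The two parts of this example can be computed from the binomial pmf directly; a short sketch (function name is our own):

```python
from math import comb

# Light-bulb example: X ~ binomial(n = 20, p = 0.2).
n, p = 20, 0.2

def binom_pmf(k):
    """P(X = k) = C(n, k) p^k (1-p)^(n-k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# (a) P(X >= 5) = 1 - P(X <= 4)
p_at_least_5 = 1 - sum(binom_pmf(k) for k in range(5))

# (b) Bulbs to replace are the failures, so the average is n(1 - p).
avg_replaced = n * (1 - p)

print(round(p_at_least_5, 4), avg_replaced)  # 0.3704 16.0
```

So the probability that at least five bulbs last more than three months is about 0.37, and on average 16 of the 20 bulbs need replacing.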
Example: A communications system consists of n components, each of which will, independently, function with probability p. The total system will be able to operate effectively if at least one-half of its components function.
(a) For what values of p is a 5-component system more likely to operate effectively than a 3-component system?
(b) In general, when is a (2k + 1)-component system better than a (2k − 1)-component system?
Solution: (a) Because the number of functioning components is a binomial random variable with parameters (n, p), it follows that the probability that a 5-component system will be effective is
  C(5, 3) p³(1 − p)² + C(5, 4) p⁴(1 − p) + p⁵,
whereas the corresponding probability for a 3-component system is
  C(3, 2) p²(1 − p) + p³.
Hence, the 5-component system is better when p > 1/2, and the two are equally effective when p = 1/2.
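The crossover at p = 1/2 can be checked numerically by summing binomial tail probabilities; a minimal sketch (function name is our own):

```python
from math import comb

# A system of n components works if at least half of them function,
# each independently with probability p.
def system_works(n, p):
    need = (n + 1) // 2  # at least one-half (majority for odd n)
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(need, n + 1))

# 5 components beat 3 components exactly when p > 1/2.
print(system_works(5, 0.6) > system_works(3, 0.6))                 # True
print(system_works(5, 0.4) < system_works(3, 0.4))                 # True
print(abs(system_works(5, 0.5) - system_works(3, 0.5)) < 1e-12)    # True
```

At p = 0.6 the 5-component system wins (≈0.683 vs ≈0.648), at p = 0.4 it loses, and at p = 0.5 both equal exactly 1/2, consistent with the analytic answer.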
3. Poisson Distribution
- A r.v. X is called a Poisson r.v. with parameter λ (> 0) if its pmf is given by
  P(X = k) = e^(−λ) λ^k / k!,  k = 0, 1, 2, …
- It may be used as an approximation for a binomial r.v. with parameters (n, p) when n is large and p is small enough so that np is of a moderate size.
- Some examples of Poisson r.v.'s include:
  i.   the number of telephone calls arriving at a switching center during various time intervals;
  ii.  the number of misprints on a page of a book;
  iii. the number of customers entering a bank during various intervals of time;
  iv.  counts in the photoelectric effect and radioactive decay;
  v.   computer message traffic arriving at a queue for transmission.
- The mean and variance of the Poisson r.v. X are both equal to λ.
Example (Poisson):
Suppose that the probability of a transistor manufactured by a certain firm being defective is 0.015. What is the probability that there is no defective transistor in a batch of 100?
Solution: Let X be the number of defective transistors in 100. The desired (binomial) probability is
  P(X = 0) = (1 − 0.015)^100 = (0.985)^100 ≈ 0.2206.
Since n is large and p is small in this case, the Poisson approximation with λ = np = 1.5 is appropriate and we obtain
  P(X = 0) ≈ e^(−1.5) ≈ 0.2231,
which is very close to the exact answer.
In practice, the Poisson approximation is frequently used when n > 10 and p < 0.1.
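The quality of the Poisson approximation in this example is easy to check in a couple of lines (a sketch; names are our own):

```python
from math import exp

# Defective-transistor example: n = 100, p = 0.015, so lambda = np = 1.5.
n, p = 100, 0.015

exact = (1 - p)**n        # binomial P(X = 0) = (1-p)^n
approx = exp(-n * p)      # Poisson approximation: e^(-lambda)

print(round(exact, 4), round(approx, 4))  # 0.2206 0.2231
```

The two values differ by less than 0.003, illustrating why the approximation is considered adequate when n is large and p is small.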
4. Geometric Distribution
- The geometric distribution arises when we want the number of Bernoulli trials required until the first occurrence of success, or, equivalently, when we ask how many times an experiment has to be performed until a certain outcome is observed.
- If X represents this number, its pmf is computed to be
  P(X = k) = p(1 − p)^(k−1),  k = 1, 2, …
- The mean and variance of the geometric r.v. X are E(X) = 1/p and Var(X) = (1 − p)/p².
Example (Geometric):
A driver is eagerly eyeing a precious parking space some distance down the street. There are five cars in front of the driver, each of which has a probability 0.2 of taking the space. What is the probability that the car immediately ahead will enter the parking space?
Solution:
For this problem we have a geometric distribution and need to evaluate it with k = 5 and p = 0.2. Thus,
  P(X = 5) = (0.2)(0.8)⁴ ≈ 0.082.
5. Hypergeometric Distribution
- The hypergeometric random variable arises in the following situation. We have a collection of N items, d of which are defective. Rather than test all N items, we select at random a small number of them, say n < N. Let X denote the number of defectives out of the n items tested. Then
  P(X = k) = C(d, k) C(N − d, n − k) / C(N, n).

Example: A lot consisting of 100 fuses is inspected by the following procedure: five fuses are selected randomly, and if all five "blow" at the specified amperage, the lot is accepted. Suppose that the lot contains 10 defective fuses. Find the probability of accepting the lot.
Solution: This is hypergeometric with N = 100, d = 10, n = 5, k = 0:
  P(X = 0) = C(90, 5)/C(100, 5) ≈ 0.584.
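The fuse-lot probability follows directly from the hypergeometric pmf above; a small sketch (function name is our own):

```python
from math import comb

# Hypergeometric pmf: k defectives among n items drawn without
# replacement from N items of which d are defective.
def hypergeom_pmf(k, N, d, n):
    return comb(d, k) * comb(N - d, n - k) / comb(N, n)

# Lot accepted iff none of the 5 sampled fuses is defective (k = 0).
p_accept = hypergeom_pmf(0, N=100, d=10, n=5)
print(round(p_accept, 4))  # 0.5838
```

So despite 10% of the lot being defective, a sample of five misses all defectives more than half the time, which is why small acceptance samples can be weak tests.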
Exercise: The components of a 6-component system are to be
randomly chosen from a bin of 20 used components. The
resulting system will be functional if at least 4 of its
6 components are in working condition. If 15 of the 20
components in the bin are in working condition, what is the
probability that the resulting system will be functional?
Ans: 0.8687

6. Negative Binomial Distribution
- A natural generalization of the geometric distribution is the distribution of the random variable X representing the number of Bernoulli trials necessary for the rth success to occur, where r is a given positive integer.
- In order to determine p_X(k) for this case, let A be the event that the first k − 1 trials yield exactly r − 1 successes, regardless of their order, and B the event that a success turns up at the kth trial. Then, owing to independence, p_X(k) = P(A)P(B).
- Now, P(A) is the binomial probability of r − 1 successes in k − 1 trials:
  P(A) = C(k − 1, r − 1) p^(r−1) (1 − p)^(k−r),
and P(B) is simply p. Finally, we obtain
  p_X(k) = C(k − 1, r − 1) p^r (1 − p)^(k−r),  k = r, r + 1, …,
with mean and variance E(X) = r/p and Var(X) = r(1 − p)/p².
Example (Negative Binomial):
A curbside parking facility has a capacity for three cars. Determine the probability that it will be full within 10 minutes. It is estimated that 6 cars will pass this parking space within the time span and, on average, 80% of all cars will want to park there.
Solution: The desired probability is simply the probability that the number of trials to the third success (taking the parking space) is less than or equal to 6. If X is this number, it has a negative binomial distribution with r = 3 and p = 0.8.
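Summing the negative binomial pmf from k = 3 to 6 finishes the example; a brief sketch (names are our own):

```python
from math import comb

# Parking example: X ~ negative binomial with r = 3, p = 0.8;
# pmf: P(X = k) = C(k-1, r-1) p^r (1-p)^(k-r), k = r, r+1, ...
r, p = 3, 0.8

def nbinom_pmf(k):
    return comb(k - 1, r - 1) * p**r * (1 - p)**(k - r)

# Facility is full within 10 minutes iff the third success occurs
# by the 6th passing car.
p_full = sum(nbinom_pmf(k) for k in range(r, 7))
print(round(p_full, 4))  # 0.983
```

With such a high parking probability per car, the facility is almost certain (≈0.983) to fill within the six passing cars.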
Some Special Distributions with their Special Applications
II. Continuous Probability Distributions
1. Uniform Distribution
- When an experiment results in "equally likely" or "totally random" outcomes over an interval, we model it with a uniform random variable.
- The pdf of X, constant over the interval (a, b), and the corresponding cdf are
  f_X(x) = { 1/(b − a), a < x < b
           { 0,         otherwise

  F_X(x) = { 0,               x ≤ a
           { (x − a)/(b − a), a < x < b
           { 1,               x ≥ b
- Its mean and variance are E(X) = (a + b)/2 and Var(X) = (b − a)²/12.
Example 1 (Uniform Distribution)
Owing to unpredictable traffic conditions, the time required by a certain student to travel from her home to her morning class is uniformly distributed between 22 and 30 minutes. If she leaves home at precisely 7:35 a.m., what is the probability that she will not be late for class, which begins promptly at 8:00 a.m.?
Solution: Let X be the arrival time of the student in minutes after 8:00 a.m. Since she leaves 25 minutes before class, X is uniformly distributed over (−3, 5). We are interested in the probability
  P(X ≤ 0) = (0 − (−3))/(5 − (−3)) = 3/8.
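The uniform cdf makes this a one-line computation; a sketch for completeness (function name is our own):

```python
# Travel-time example: arrival time X (minutes after 8:00 a.m.) is
# uniform on (-3, 5), since travel time is uniform(22, 30) and the
# student leaves 25 minutes before class.
a, b = -3.0, 5.0

def uniform_cdf(x):
    """cdf of uniform(a, b)."""
    if x <= a:
        return 0.0
    if x >= b:
        return 1.0
    return (x - a) / (b - a)

# On time iff she arrives at or before 8:00, i.e. X <= 0.
print(uniform_cdf(0.0))  # 0.375
```

The answer 3/8 = 0.375 matches the hand computation.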
Example 2 (Uniform Distribution)
In coherent radio communications, the phase difference between the transmitter and the receiver, denoted by Θ, is modeled as having a density f ∼ uniform[−π, π]. Find P(Θ ≤ 0) and P(Θ ≤ π/2).

Solution:
  P(Θ ≤ 0) = (0 − (−π))/(2π) = 1/2,  P(Θ ≤ π/2) = (π/2 + π)/(2π) = 3/4.
2. Exponential Distribution
- A RV X is called exponential, written X ∼ exp(λ), with parameter λ > 0 if its pdf is
  f_X(x) = λ e^(−λx) for x ≥ 0, and 0 otherwise.
- As λ increases, the height of the density increases and its width decreases.
- It is often used to model lifetimes, such as how long a cell-phone call lasts or how long it takes a computer network to transmit a message from one node to another.
- It also arises as a function of other random variables. For example:
  - if U ∼ uniform(0, 1), then X = ln(1/U) is exp(1);
  - if U and V are independent zero-mean Gaussian RVs with common variance, then U² + V² is exponential and √(U² + V²) is Rayleigh.
- Its mean and variance are E(X) = 1/λ and Var(X) = 1/λ².
Example 1 (Exponential)
Assume that the length of a phone call in minutes is an exponential r.v. X with parameter λ = 0.1. If someone arrives at a phone booth just before you arrive, find the probability that you will have to wait
(a) less than 5 minutes, and (b) between 5 and 10 minutes.
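Both parts follow from the exponential cdf F_X(x) = 1 − e^(−λx); a short sketch of the computation (names are our own):

```python
from math import exp

# Phone-booth example: waiting time X ~ exp(lambda) with lam = 0.1.
lam = 0.1

def expo_cdf(x):
    """F_X(x) = 1 - e^(-lambda * x), x >= 0."""
    return 1 - exp(-lam * x)

p_a = expo_cdf(5)                # (a) P(X < 5)
p_b = expo_cdf(10) - expo_cdf(5) # (b) P(5 < X < 10)

print(round(p_a, 4), round(p_b, 4))  # 0.3935 0.2387
```

So the chance of waiting less than 5 minutes is 1 − e^(−0.5) ≈ 0.3935, and of waiting between 5 and 10 minutes is e^(−0.5) − e^(−1) ≈ 0.2387.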
Example 2 (Exponential)
All manufactured devices and machines fail to work sooner or later. Suppose that the failure rate is constant and the time to failure (in hours) is an exponential r.v. X with parameter λ. Measurements show that the probability that the time to failure for computer memory chips in a given class exceeds 10⁴ hours is 0.368.
(a) Calculate the value of the parameter λ.
(b) Using the value of λ determined in part (a), calculate the time x₀ such that the probability that the time to failure is less than x₀ is 0.05.
3. Laplace / Double-Sided Exponential Distribution
- For λ > 0, we write X ∼ Laplace(λ) if its pdf is
  f_X(x) = (λ/2) e^(−λ|x|).
- As λ increases, the height increases and the width decreases.

Fig: pdf of the Laplace density.

Example (Laplace): An Internet router can send packets via route 1 or route 2. The packet delays on each route are independent exp(λ) random variables, and so the difference in delay between route 1 and route 2, denoted by X, has a Laplace(λ) density. Find P(−3 ≤ X ≤ −2 or 0 ≤ X ≤ 3).
Solution. The desired probability can be written as
  P({−3 ≤ X ≤ −2} ∪ {0 ≤ X ≤ 3}).
Since these are disjoint events, the probability of the union is the sum of the individual probabilities, so we need to compute P(−3 ≤ X ≤ −2) and P(0 ≤ X ≤ 3). Since X has a Laplace(λ) density, these probabilities are equal to the areas under the density over the corresponding intervals. We first compute
  P(−3 ≤ X ≤ −2) = ∫_{−3}^{−2} (λ/2) e^(λx) dx = (e^(−2λ) − e^(−3λ))/2,
and similarly
  P(0 ≤ X ≤ 3) = ∫_0^3 (λ/2) e^(−λx) dx = (1 − e^(−3λ))/2.
The desired probability is then the sum of these two quantities.
4. Cauchy Distribution
- The pdf of a Cauchy random variable X ∼ Cauchy(λ) with parameter λ > 0 is given by
  f_X(x) = (λ/π) / (λ² + x²).
- As λ increases, the height decreases and the width increases.
- The Cauchy random variable arises as the tangent of a uniform random variable and also as the quotient of independent Gaussian random variables.

Example: Find the cdf of a Cauchy random variable X with parameter λ = 1.
Solution: F_X(x) = 1/2 + (1/π) tan⁻¹(x).
5. Gaussian or Normal Distribution
- The most important density is the Gaussian or normal. For σ² > 0, we write X ∼ N(m, σ²) if its pdf is given by
  f_X(x) = (1/(σ√(2π))) e^(−(x − m)²/(2σ²)).
- The density is concave for x ∈ [m − σ, m + σ] and convex for x outside this interval.
- As σ increases, the height of the density decreases and it becomes wider, as illustrated in the figure below.
- If m = 0 and σ² = 1, we say that f is a standard normal density.
- Its mean and variance are E(X) = m and Var(X) = σ².
- Due to the central limit theorem, the Gaussian density is a good approximation for computing probabilities involving a sum of many independent random variables. For example, let
  X = X₁ + ··· + Xₙ,
where the Xᵢ are i.i.d. with common mean m and common variance σ². For large n, if the Xᵢ are continuous random variables, then X is approximately N(nm, nσ²), i.e.,
  P(X ≤ x) ≈ Φ((x − nm)/(σ√n)),
while if the Xᵢ are integer-valued,
  P(X = k) ≈ (1/(σ√(2πn))) e^(−(k − nm)²/(2nσ²)).
- Noise current, which results from the sum of forces of many independent collisions on an atomic scale, is well described by the Gaussian density. For this reason, Gaussian random variables are the noise model of choice in electronic communication and control systems.
- If X ∼ N(μ, σ²), then Z = (X − μ)/σ is a normal random variable with mean 0 and variance 1. Such a random variable Z is said to have a standard, or unit, normal distribution. Let Φ(x) denote its distribution function, i.e.,
  Φ(x) = P(Z ≤ x) = (1/√(2π)) ∫_{−∞}^{x} e^(−t²/2) dt.
- This conversion is quite important, for it enables us to write all probability statements about X in terms of probabilities for Z. For instance, to obtain P{X < b}, we note that X will be less than b if and only if (X − μ)/σ is less than (b − μ)/σ, and so
  P{X < b} = P{Z < (b − μ)/σ} = Φ((b − μ)/σ).
- Similarly, for any a < b,
  P{a < X < b} = Φ((b − μ)/σ) − Φ((a − μ)/σ).
- Φ(x) is computed by an approximation and the results are tabulated for a wide range of nonnegative values of x (standard normal probability tables).
- Although Φ(x) is tabulated for nonnegative values of x only, we can also obtain Φ(−x) from the table by making use of the symmetry (about 0) of the standard normal density: for x > 0, if Z represents a standard normal random variable, then
  Φ(−x) = P{Z ≤ −x} = P{Z > x} = 1 − Φ(x).
Example 1 (normal): If X is a normal random variable with mean μ = 3 and variance σ² = 16, find
(a) P{X < 11}; (b) P{X > −1}; (c) P{2 < X < 7}.
Solution: Here σ = 4, so
(a) P{X < 11} = Φ((11 − 3)/4) = Φ(2) ≈ 0.9772;
(b) P{X > −1} = 1 − Φ((−1 − 3)/4) = Φ(1) ≈ 0.8413;
(c) P{2 < X < 7} = Φ(1) − Φ(−1/4) ≈ 0.8413 − 0.4013 = 0.4400.
Example 2 (normal): A production line manufactures 1000-ohm resistors that have 10 percent tolerance. Let X denote the resistance of a resistor. Assuming that X is a normal r.v. with mean 1000 and variance 2500, find the probability that a resistor picked at random will be rejected.
Solution: Let A be the event that a resistor is rejected. Then A = {X < 900} ∪ {X > 1100}. Since {X < 900} ∩ {X > 1100} = ∅, we have
  P(A) = P(X < 900) + P(X > 1100) = Φ(−2) + 1 − Φ(2) = 2[1 − Φ(2)] ≈ 0.045.
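The rejection probability can be reproduced using the standard normal cdf, which can be expressed through the error function; a short sketch (names are our own):

```python
from math import erf, sqrt

# Resistor example: X ~ N(1000, 2500), rejected outside the
# 10% tolerance band (900, 1100).
m, sigma = 1000.0, 50.0

def Phi(x):
    """Standard normal cdf via the error function."""
    return 0.5 * (1 + erf(x / sqrt(2)))

p_reject = Phi((900 - m) / sigma) + (1 - Phi((1100 - m) / sigma))
print(round(p_reject, 4))  # 0.0455
```

The band endpoints sit two standard deviations from the mean, so the result is 2[1 − Φ(2)] ≈ 0.0455, about a 4.6% rejection rate.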
Location & Scale Parameters and the Gamma Densities
- In the exponential and Laplace densities, λ is a scale parameter, while in the Cauchy density, 1/λ is a scale parameter.
- In the Gaussian, m is a location parameter and 1/σ is a scale parameter. As σ increases, the density becomes shorter and wider, while as σ decreases, the density becomes taller and narrower.
6. Gamma Distribution
- An important application of the scale parameter arises with the basic gamma density with parameter p > 0. This density is given by
  g_p(x) = x^(p−1) e^(−x) / Γ(p),  x > 0,
where Γ(p) is the gamma function; the scaled version is g_{p,λ}(x) = λ g_p(λx).
- When p = m is a positive integer, g_{m,λ} is called an Erlang(m, λ) density. The sum of m i.i.d. exp(λ) random variables is an Erlang(m, λ) random variable.
- For example, if m customers are waiting in a queue, and the service time for each one is exp(λ), then the time to serve all m is Erlang(m, λ). The Erlang densities g₁(x), g₂(x), and g₃(x) for m = 1, 2, 3 and λ = 1 are shown below.
Fig: The gamma densities g_p(x) for p = 1, p = 3/2, p = 2, and p = 3.

- When p = k/2 and λ = 1/2, g_{p,λ} is called a chi-squared density with k degrees of freedom.
- The chi-squared random variable arises as the square of a normal random variable.
- In communication systems employing noncoherent receivers, the incoming signal is squared before further processing. Since the thermal noise in these receivers is Gaussian, chi-squared random variables naturally appear.
Exercises
1. A binary source generates digits 1 and 0 randomly with probabilities 0.7 and 0.3, respectively.
(a) What is the probability that two 1s and three 0s will occur in a five-digit sequence?
(b) What is the probability that at least three 1s will occur in a seven-digit sequence?
2. A noisy transmission channel has a per-digit error probability p = 0.03.
(a) Calculate the probability of more than one error in 10 received digits.
(b) Repeat (a), using the Poisson approximation.
3. It is known that the floppy disks produced by company A will be defective with probability 0.01. The company sells the disks in packages of 10 and offers a guarantee of replacement if more than 1 of the 10 disks is defective. What proportion of packages is returned? If someone buys three packages, what is the probability that exactly one of them will be returned?
4. The radial miss distance [in meters (m)] of the landing point of a parachuting sky diver from the center of the target area is known to be a Rayleigh r.v. X with parameter
(a) Find the probability that the sky diver will land within a radius of 10 m from the center of the target area.
(b) Find the radius r such that the probability that X > r is 0.368.
5. Binary data are transmitted over a noisy communication channel in blocks of 8 binary digits. The probability that a received digit is in error as a result of channel noise is 0.03. Assume that the errors occurring in various digit positions within a block are independent.
(a) Find the mean and the variance of the number of errors per block.
(b) Find the probability that the number of errors per block is ≥ 4.
6. Suppose the probability that a bit transmitted through a digital communication channel is received in error is 0.1. Assuming that the transmissions are independent events, find the probability that the third error occurs at the 10th bit.
Ans. 0.017
7. A company produces independent voltage regulators whose outputs
are exp(λ ) random variables. In a batch of 10 voltage regulators, find the
probability that exactly three of them produce outputs greater than v
volts.
8. A small airline makes five flights a day from Chicago to Denver. The
number of passengers on each flight is approximated by an exponential
random variable with mean 20. A flight makes money if it has more than
25 passengers. Find the probability that at least one flight a day makes
money. Assume that the numbers of passengers on different flights are
independent.
9. A certain photo-sensor fails to activate if it receives fewer than four
photons in a certain time interval. If the number of photons is modeled
by a Poisson(2) random variable X, find the probability that the sensor
activates.
Answer: 0.143.
10. Ten-bit codewords are transmitted over a noisy channel. Bits are flipped independently with probability p. If no more than two bits of a codeword are flipped, the codeword can be correctly decoded. Find the probability that a codeword cannot be correctly decoded.
