
Random Processes

7.1 INTRODUCTION
Models for random message signals and noise encountered in communication systems are developed in this chapter. Random signals cannot be explicitly described prior to their occurrence, and noises cannot be described by deterministic functions of time. However, when observed over a long period, a random signal or noise may exhibit certain regularities that can be described in terms of probabilities and statistical averages. Such a model, in the form of a probabilistic description of a collection of functions of time, is called a random process.

7.2 DEFINITIONS AND NOTATIONS OF RANDOM PROCESSES


Consider a random experiment with outcomes $\lambda$ and a sample space $S$. If to every outcome $\lambda \in S$ we assign a real-valued time function $X(t, \lambda)$, we create a random (or stochastic) process. A random process $X(t, \lambda)$ is therefore a function of two parameters, the time $t$ and the outcome $\lambda$. For a specific $\lambda$, say $\lambda_i$, we have a single time function $X(t, \lambda_i) = x_i(t)$. This time function is called a sample function. The totality of all sample functions is called an ensemble. For a specific time $t_j$, $X(t_j, \lambda) = X_j$ denotes a random variable. For fixed $t\,(= t_j)$ and fixed $\lambda\,(= \lambda_i)$, $X(t_j, \lambda_i) = x_i(t_j)$ is a number.
Thus, a random process is sometimes defined as a family of random variables indexed by the parameter $t \in T$, where $T$ is called the index set.
Figure 7.1 illustrates the concepts of the sample space of the random experiment, outcomes of the experiment, associated sample functions, and random variables resulting from taking two measurements of the sample functions.
In the following we use the notation $X(t)$ to represent $X(t, \lambda)$.

7.3 STATISTICS OF RANDOM PROCESSES


A. Probabilistic Expressions
Consider a random process $X(t)$. For a particular time $t_1$, $X(t_1) = X_1$ is a random variable, and its distribution function $F_X(x_1; t_1)$ is defined as
$$F_X(x_1; t_1) = P\{X(t_1) \le x_1\} \qquad (7.1)$$
Analog and Digital Communications

Fig. 7.1 Random process

where x, is any real number.


$F_X(x_1; t_1)$ is called the first-order distribution of $X(t)$. The corresponding first-order density function is obtained by
$$f_X(x_1; t_1) = \frac{\partial F_X(x_1; t_1)}{\partial x_1} \qquad (7.2)$$

Similarly, given $t_1$ and $t_2$, $X(t_1) = X_1$ and $X(t_2) = X_2$ represent two random variables. Their joint distribution is called the second-order distribution and is given by
$$F_X(x_1, x_2; t_1, t_2) = P\{X(t_1) \le x_1,\, X(t_2) \le x_2\} \qquad (7.3)$$
where $x_1$ and $x_2$ are any real numbers.
The corresponding second-order density function is obtained by
$$f_X(x_1, x_2; t_1, t_2) = \frac{\partial^2 F_X(x_1, x_2; t_1, t_2)}{\partial x_1\,\partial x_2} \qquad (7.4)$$
In a similar manner, for $n$ random variables $X(t_i) = X_i\ (i = 1, \ldots, n)$, the $n$th-order distribution is
$$F_X(x_1, \ldots, x_n; t_1, \ldots, t_n) = P\{X(t_1) \le x_1, \ldots, X(t_n) \le x_n\} \qquad (7.5)$$
The corresponding $n$th-order density function is
$$f_X(x_1, \ldots, x_n; t_1, \ldots, t_n) = \frac{\partial^n F_X(x_1, \ldots, x_n; t_1, \ldots, t_n)}{\partial x_1 \cdots \partial x_n} \qquad (7.6)$$

B. Statistical Averages
As in the case of random variables, random processes are often described by using statistical averages (or ensemble averages).
The mean of $X(t)$ is defined by
$$\mu_X(t) = E[X(t)] = \int_{-\infty}^{\infty} x f_X(x; t)\,dx \qquad (7.7)$$
where $X(t)$ is treated as a random variable for a fixed value of $t$.
The autocorrelation of $X(t)$ is defined by
$$R_{XX}(t_1, t_2) = E[X(t_1)X(t_2)] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x_1 x_2 f_X(x_1, x_2; t_1, t_2)\,dx_1\,dx_2 \qquad (7.8)$$
The autocovariance of $X(t)$ is defined by
$$C_{XX}(t_1, t_2) = E\{[X(t_1) - \mu_X(t_1)][X(t_2) - \mu_X(t_2)]\} = R_{XX}(t_1, t_2) - \mu_X(t_1)\mu_X(t_2) \qquad (7.9)$$
The $n$th joint moment of $X(t)$ is defined by
$$E[X(t_1) \cdots X(t_n)] = \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} x_1 \cdots x_n f_X(x_1, \ldots, x_n; t_1, \ldots, t_n)\,dx_1 \cdots dx_n \qquad (7.10)$$
C. Stationarity
1. Strict-Sense Stationary A random process $X(t)$ is called strict-sense stationary (SSS) if its statistics are invariant to a shift of origin. In other words, the process $X(t)$ is SSS if
$$f_X(x_1, \ldots, x_n; t_1, \ldots, t_n) = f_X(x_1, \ldots, x_n; t_1 + c, \ldots, t_n + c) \qquad (7.11)$$
for any $c$.
From Eq. (7.11) it follows that $f_X(x_1; t_1) = f_X(x_1; t_1 + c)$ for any $c$. Hence the first-order density of a stationary $X(t)$ is independent of $t$:
$$f_X(x_1; t_1) = f_X(x_1) \qquad (7.12)$$
Similarly, $f_X(x_1, x_2; t_1, t_2) = f_X(x_1, x_2; t_1 + c, t_2 + c)$ for any $c$. Setting $c = -t_1$, we obtain
$$f_X(x_1, x_2; t_1, t_2) = f_X(x_1, x_2; t_2 - t_1) \qquad (7.13)$$
which indicates that if $X(t)$ is SSS, the joint density of the random variables $X(t)$ and $X(t + \tau)$ is independent of $t$ and depends only on the time difference $\tau$.

2. Wide-Sense Stationary A random process $X(t)$ is called wide-sense stationary (WSS) if its mean is constant,
$$E[X(t)] = \mu_X \qquad (7.14)$$
and its autocorrelation depends only on the time difference $\tau$:
$$E[X(t)X(t + \tau)] = R_{XX}(\tau) \qquad (7.15)$$
From Eqs (7.9) and (7.15) it follows that the autocovariance of a WSS process also depends only on the time difference $\tau$:
$$C_{XX}(\tau) = R_{XX}(\tau) - \mu_X^2 \qquad (7.16)$$
Setting $\tau = 0$ in Eq. (7.15), we obtain
$$E[X^2(t)] = R_{XX}(0) \qquad (7.17)$$
Thus, the average power of a WSS process is independent of $t$ and equals $R_{XX}(0)$.
Note that an SSS process is WSS, but a WSS process is not necessarily SSS.
Two processes $X(t)$ and $Y(t)$ are called jointly wide-sense stationary (jointly WSS) if each is WSS and their cross-correlation depends only on the time difference $\tau$:
$$R_{XY}(t, t + \tau) = E[X(t)Y(t + \tau)] = R_{XY}(\tau) \qquad (7.18)$$
From Eq. (7.18) it follows that the cross-covariance of jointly WSS $X(t)$ and $Y(t)$ also depends only on the time difference $\tau$:
$$C_{XY}(\tau) = R_{XY}(\tau) - \mu_X \mu_Y \qquad (7.19)$$

D. Time Averages and Ergodicity


The time-averaged mean of a sample function $x(t)$ of a random process $X(t)$ is defined as
$$\bar{x} = \langle x(t) \rangle = \lim_{T \to \infty} \frac{1}{T}\int_{-T/2}^{T/2} x(t)\,dt \qquad (7.20)$$
where the symbol $\langle \cdot \rangle$ denotes time averaging.
Similarly, the time-averaged autocorrelation of the sample function $x(t)$ is defined as
$$\bar{R}_{XX}(\tau) = \langle x(t)x(t + \tau) \rangle = \lim_{T \to \infty} \frac{1}{T}\int_{-T/2}^{T/2} x(t)x(t + \tau)\,dt \qquad (7.21)$$
Note that $\bar{x}$ and $\bar{R}_{XX}(\tau)$ are random variables; their values depend on which sample function of $X(t)$ is used in the time-averaging evaluations.
If $X(t)$ is stationary, then by taking the expected value on both sides of Eqs (7.20) and (7.21), we obtain
$$E[\bar{x}] = \lim_{T \to \infty} \frac{1}{T}\int_{-T/2}^{T/2} E[X(t)]\,dt = \mu_X \qquad (7.22)$$
which indicates that the expected value of the time-averaged mean is equal to the ensemble mean, and
$$E[\bar{R}_{XX}(\tau)] = \lim_{T \to \infty} \frac{1}{T}\int_{-T/2}^{T/2} E[X(t)X(t + \tau)]\,dt = R_{XX}(\tau) \qquad (7.23)$$
which indicates that the expected value of the time-averaged autocorrelation is equal to the ensemble autocorrelation.
A random process $X(t)$ is said to be ergodic if time averages are the same for all sample functions and equal to the corresponding ensemble averages. Thus, in an ergodic process, all of its statistics can be obtained by observing a single sample function $x(t) = X(t, \lambda)$ ($\lambda$ fixed) of the process.
A stationary process $X(t)$ is called ergodic in the mean if
$$\bar{x} = \langle x(t) \rangle = E[X(t)] = \mu_X \qquad (7.24)$$
Similarly, a stationary process $X(t)$ is called ergodic in the autocorrelation if
$$\bar{R}_{XX}(\tau) = \langle x(t)x(t + \tau) \rangle = E[X(t)X(t + \tau)] = R_{XX}(\tau) \qquad (7.25)$$
Testing for the ergodicity of a random process is usually very difficult. A reasonable assumption in the analysis of most communication signals is that the random waveforms are ergodic in the mean and in the autocorrelation. Fundamental electrical engineering parameters, such as dc value, root-mean-square (rms) value, and average power, can be related to the moments of an ergodic random process. They are summarized in the following:
1. $\bar{x} = \langle x(t) \rangle$ is equal to the dc level of the signal.
2. $\bar{x}^2 = \langle x(t) \rangle^2$ is equal to the normalized power in the dc component.
3. $\bar{R}_{XX}(0) = \langle x^2(t) \rangle$ is equal to the total average normalized power.
4. $\sigma_X^2 = \langle x^2(t) \rangle - \langle x(t) \rangle^2$ is equal to the average normalized power in the time-varying or ac component of the signal.
5. $\sigma_X$ is equal to the rms value of the ac component of the signal.
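The five relations above can be illustrated numerically from a single long sample function. The sketch below (an assumed example, not from the text: a 1-V dc level plus a 2-V-amplitude sinusoid) estimates the dc level, dc power, total power, ac power, and ac rms value as time averages:

```python
import numpy as np

# Sketch: estimate the five ergodic parameters from one sample function
# x(t) = 1 + 2*cos(w0*t + theta).  Expected values: dc level 1, dc power 1,
# total power 1 + A^2/2 = 3, ac power 2, ac rms sqrt(2).
w0, theta = 2 * np.pi * 5.0, 0.7          # arbitrary illustrative values
t = np.linspace(0.0, 10.0, 200_000)       # long observation window (50 periods)
x = 1.0 + 2.0 * np.cos(w0 * t + theta)

x_bar = np.mean(x)                 # 1. <x(t)>       -> dc level
dc_power = x_bar ** 2              # 2. <x(t)>^2     -> dc power
total_power = np.mean(x ** 2)      # 3. <x^2(t)>     -> total average power
ac_power = total_power - dc_power  # 4. sigma_X^2    -> ac power
ac_rms = np.sqrt(ac_power)         # 5. sigma_X      -> rms of the ac component

print(x_bar, total_power, ac_power, ac_rms)
```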

7.4 CORRELATIONS AND POWER SPECTRAL DENSITIES


In the following we assume that all random processes are WSS.

A. Autocorrelation $R_{XX}(\tau)$
The autocorrelation of $X(t)$ is [Eq. (7.15)]
$$R_{XX}(\tau) = E[X(t)X(t + \tau)]$$
Properties of $R_{XX}(\tau)$:
1. $R_{XX}(-\tau) = R_{XX}(\tau)$ \qquad (7.26)
2. $|R_{XX}(\tau)| \le R_{XX}(0)$ \qquad (7.27)
3. $R_{XX}(0) = E[X^2(t)]$ \qquad (7.28)

B. Cross-correlation $R_{XY}(\tau)$
The cross-correlation of $X(t)$ and $Y(t)$ is [Eq. (7.18)]
$$R_{XY}(\tau) = E[X(t)Y(t + \tau)]$$
Properties of $R_{XY}(\tau)$:
1. $R_{XY}(-\tau) = R_{YX}(\tau)$ \qquad (7.29)
2. $|R_{XY}(\tau)| \le \sqrt{R_{XX}(0)R_{YY}(0)}$ \qquad (7.30)
3. $|R_{XY}(\tau)| \le \tfrac{1}{2}[R_{XX}(0) + R_{YY}(0)]$ \qquad (7.31)

C. Autocovariance $C_{XX}(\tau)$
The autocovariance of $X(t)$ is [Eq. (7.9)]
$$C_{XX}(\tau) = E\{[X(t) - E[X(t)]][X(t + \tau) - E[X(t + \tau)]]\} \qquad (7.32)$$

D. Cross-covariance $C_{XY}(\tau)$
The cross-covariance of $X(t)$ and $Y(t)$ is
$$C_{XY}(\tau) = E\{[X(t) - E[X(t)]][Y(t + \tau) - E[Y(t + \tau)]]\} \qquad (7.33)$$
Two processes $X(t)$ and $Y(t)$ are called (mutually) orthogonal if
$$R_{XY}(\tau) = 0 \qquad (7.34)$$
They are called uncorrelated if
$$C_{XY}(\tau) = 0 \qquad (7.35)$$

E. Power Spectral Density or Power Spectrum


Let $R_{XX}(\tau)$ be the autocorrelation of $X(t)$. Then the power spectral density (or power spectrum) of $X(t)$ is defined by the Fourier transform of $R_{XX}(\tau)$ (Sec. 1.3) as
$$S_{XX}(\omega) = \int_{-\infty}^{\infty} R_{XX}(\tau) e^{-j\omega\tau}\,d\tau \qquad (7.36)$$
Thus,
$$R_{XX}(\tau) = \frac{1}{2\pi}\int_{-\infty}^{\infty} S_{XX}(\omega) e^{j\omega\tau}\,d\omega \qquad (7.37)$$
Equations (7.36) and (7.37) are known as the Wiener-Khinchin relations.
Properties of $S_{XX}(\omega)$:
1. $S_{XX}(\omega)$ is real and $S_{XX}(\omega) \ge 0$ \qquad (7.38)
2. $S_{XX}(-\omega) = S_{XX}(\omega)$ \qquad (7.39)
3. $\dfrac{1}{2\pi}\displaystyle\int_{-\infty}^{\infty} S_{XX}(\omega)\,d\omega = R_{XX}(0) = E[X^2(t)]$ \qquad (7.40)
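The Wiener-Khinchin relation (7.36) can be checked numerically. The sketch below (an assumed example, not from the text) transforms the autocorrelation $R_{XX}(\tau) = e^{-2a|\tau|}$ by direct numerical integration and compares the result with its known closed-form spectrum $4a/(\omega^2 + 4a^2)$, the same pair that appears later in Solved Problem 7.11:

```python
import numpy as np

# Sketch of Eq. (7.36): numerically Fourier-transform a known autocorrelation,
# R_XX(tau) = exp(-2*a*|tau|), and compare against its closed-form spectrum
# S_XX(w) = 4a / (w^2 + 4a^2).
a = 1.5
tau = np.linspace(-40.0, 40.0, 400_001)   # wide grid; tails are negligible here
dtau = tau[1] - tau[0]
R = np.exp(-2.0 * a * np.abs(tau))

w = np.array([0.0, 1.0, 2.5, 7.0])        # a few test frequencies
# Since R is even, the transform reduces to the cosine integral of Eq. (7.96)
S_num = np.array([np.sum(R * np.cos(wk * tau)) * dtau for wk in w])
S_ana = 4.0 * a / (w ** 2 + 4.0 * a ** 2)

print(S_num, S_ana)   # the two should agree closely
```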

F. Cross Spectral Densities


The cross spectral densities (or cross power spectra) $S_{XY}(\omega)$ and $S_{YX}(\omega)$ of $X(t)$ and $Y(t)$ are defined by
$$S_{XY}(\omega) = \int_{-\infty}^{\infty} R_{XY}(\tau) e^{-j\omega\tau}\,d\tau \qquad (7.41)$$
and
$$S_{YX}(\omega) = \int_{-\infty}^{\infty} R_{YX}(\tau) e^{-j\omega\tau}\,d\tau \qquad (7.42)$$
The cross-correlations and cross spectral densities thus form Fourier transform pairs. Thus,
$$R_{XY}(\tau) = \frac{1}{2\pi}\int_{-\infty}^{\infty} S_{XY}(\omega) e^{j\omega\tau}\,d\omega \qquad (7.43)$$
$$R_{YX}(\tau) = \frac{1}{2\pi}\int_{-\infty}^{\infty} S_{YX}(\omega) e^{j\omega\tau}\,d\omega \qquad (7.44)$$
Substituting the relation [Eq. (7.29)]
$$R_{XY}(-\tau) = R_{YX}(\tau)$$
in Eqs (7.41) and (7.42), we obtain
$$S_{XY}(\omega) = S_{YX}(-\omega) = S_{YX}^*(\omega) \qquad (7.45)$$

7.5 TRANSMISSION OF RANDOM PROCESSES THROUGH LINEAR SYSTEMS

A. System Response
A linear time-invariant (LTI) system is represented by (Sec. 2.2)
$$Y(t) = L[X(t)] \qquad (7.46)$$
where $Y(t)$ is the output random process of an LTI system, represented by a linear operator $L$, with input random process $X(t)$.
Let $h(t)$ be the impulse response of an LTI system (Fig. 7.2). Then [Sec. 2.2, Eq. (2.8)]
$$Y(t) = h(t) * X(t) = \int_{-\infty}^{\infty} h(\alpha) X(t - \alpha)\,d\alpha \qquad (7.47)$$
Fig. 7.2 LTI system
B. Mean and Autocorrelation of the Output
$$\mu_Y(t) = E[Y(t)] = E\left[\int_{-\infty}^{\infty} h(\alpha) X(t - \alpha)\,d\alpha\right] = \int_{-\infty}^{\infty} h(\alpha) E[X(t - \alpha)]\,d\alpha = \int_{-\infty}^{\infty} h(\alpha)\mu_X(t - \alpha)\,d\alpha = h(t) * \mu_X(t) \qquad (7.48)$$
$$R_{YY}(t_1, t_2) = E[Y(t_1)Y(t_2)] = E\left[\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} h(\alpha)h(\beta) X(t_1 - \alpha)X(t_2 - \beta)\,d\alpha\,d\beta\right] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} h(\alpha)h(\beta) R_{XX}(t_1 - \alpha, t_2 - \beta)\,d\alpha\,d\beta \qquad (7.49)$$
If the input $X(t)$ is WSS, then from Eq. (7.48) we have
$$E[Y(t)] = \mu_X\int_{-\infty}^{\infty} h(\alpha)\,d\alpha = \mu_X H(0) \qquad (7.50)$$
where $H(0)$ is the frequency response of the linear system at $\omega = 0$. Thus, the mean of the output is a constant.
The autocorrelation of the output given in Eq. (7.49) becomes
$$R_{YY}(t_1, t_2) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} h(\alpha)h(\beta) R_{XX}(t_2 - t_1 + \alpha - \beta)\,d\alpha\,d\beta \qquad (7.51)$$
which indicates that $R_{YY}(t_1, t_2)$ is a function of the time difference $\tau = t_2 - t_1$. Hence,
$$R_{YY}(\tau) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} h(\alpha)h(\beta) R_{XX}(\tau + \alpha - \beta)\,d\alpha\,d\beta \qquad (7.52)$$
Thus, we conclude that if the input $X(t)$ is WSS, the output $Y(t)$ is also WSS.

C. Power Spectral Density of the Output
Taking the Fourier transform of both sides of Eq. (7.52), we obtain the power spectral density of the output:
$$S_{YY}(\omega) = \int_{-\infty}^{\infty} R_{YY}(\tau) e^{-j\omega\tau}\,d\tau = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} h(\alpha)h(\beta) R_{XX}(\tau + \alpha - \beta) e^{-j\omega\tau}\,d\tau\,d\alpha\,d\beta = |H(\omega)|^2 S_{XX}(\omega) \qquad (7.53)$$
Thus, we obtain the important result that the power spectral density of the output is the product of the power spectral density of the input and the magnitude squared of the frequency response of the system.
When the autocorrelation of the output $R_{YY}(\tau)$ is desired, it is easier to determine the power spectral density $S_{YY}(\omega)$ and then to evaluate the inverse Fourier transform (Solved Problem 7.21). Thus,
$$R_{YY}(\tau) = \frac{1}{2\pi}\int_{-\infty}^{\infty} S_{YY}(\omega) e^{j\omega\tau}\,d\omega = \frac{1}{2\pi}\int_{-\infty}^{\infty} |H(\omega)|^2 S_{XX}(\omega) e^{j\omega\tau}\,d\omega \qquad (7.54)$$
By Eq. (7.17), the average power in the output $Y(t)$ is
$$E[Y^2(t)] = R_{YY}(0) = \frac{1}{2\pi}\int_{-\infty}^{\infty} |H(\omega)|^2 S_{XX}(\omega)\,d\omega \qquad (7.55)$$
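A discrete-time analogue of Eqs (7.53) and (7.55) is easy to simulate. In the sketch below (an assumed example, not from the text), white noise of variance $\sigma^2$ (flat spectrum) drives a one-pole filter $h[n] = \rho^n u[n]$; the output-power prediction of (7.55) reduces, via Parseval's relation, to $\sigma^2 \sum_n h^2[n] = \sigma^2/(1 - \rho^2)$:

```python
import numpy as np

# Sketch: white noise through an LTI filter; compare the simulated output power
# with the prediction sigma2 * sum h[n]^2 = sigma2 / (1 - rho^2) from Eq. (7.55).
rng = np.random.default_rng(0)
sigma2, rho, N = 2.0, 0.8, 400_000

x = rng.normal(0.0, np.sqrt(sigma2), N)   # white input, flat S_XX = sigma2
y = np.empty(N)
acc = 0.0
for n in range(N):                        # one-pole filter: y[n] = rho*y[n-1] + x[n]
    acc = rho * acc + x[n]
    y[n] = acc

power_sim = np.mean(y[1000:] ** 2)        # time-average power, transient skipped
power_theory = sigma2 / (1.0 - rho ** 2)
print(power_sim, power_theory)
```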

7.6 SPECIAL CLASSES OF RANDOM PROCESSES


In this section, we consider some important random processes that are commonly encountered in the
study of communication systems.

A. Gaussian Random Process


Consider a random process $X(t)$, and define $n$ random variables $X(t_1), \ldots, X(t_n)$ corresponding to $n$ time instants $t_1, \ldots, t_n$. Let $\mathbf{X}$ be a random vector ($n \times 1$ matrix) defined by
$$\mathbf{X} = \begin{bmatrix} X(t_1) \\ \vdots \\ X(t_n) \end{bmatrix} \qquad (7.56)$$
Let $\mathbf{x}$ be an $n$-dimensional vector ($n \times 1$ matrix) defined by
$$\mathbf{x} = \begin{bmatrix} x_1 \\ \vdots \\ x_n \end{bmatrix} \qquad (7.57)$$
so that the event $\{X(t_1) \le x_1, \ldots, X(t_n) \le x_n\}$ is written $\{\mathbf{X} \le \mathbf{x}\}$. Then $X(t)$ is called a gaussian (or normal) process if $\mathbf{X}$ has a jointly multivariate gaussian density function for every finite set of $\{t_i\}$ and every $n$.
The multivariate gaussian density function is given by
$$f_{\mathbf{X}}(\mathbf{x}) = \frac{1}{(2\pi)^{n/2}(\det C)^{1/2}}\exp\!\left[-\tfrac{1}{2}(\mathbf{x} - \boldsymbol{\mu})^T C^{-1}(\mathbf{x} - \boldsymbol{\mu})\right] \qquad (7.58)$$
where $T$ denotes the transpose, $\boldsymbol{\mu}$ is the vector of means, and $C$ is the covariance matrix, given by
$$\boldsymbol{\mu} = E[\mathbf{X}] = \begin{bmatrix} E[X(t_1)] \\ \vdots \\ E[X(t_n)] \end{bmatrix} \qquad (7.59)$$
$$C = \begin{bmatrix} C_{11} & \cdots & C_{1n} \\ \vdots & & \vdots \\ C_{n1} & \cdots & C_{nn} \end{bmatrix} \qquad (7.60)$$
where
$$C_{ij} = C_{XX}(t_i, t_j) = R_{XX}(t_i, t_j) - \mu_i\mu_j \qquad (7.61)$$
which is the covariance of $X(t_i)$ and $X(t_j)$, and $\det C$ is the determinant of the matrix $C$.
Some of the important properties of a gaussian process are as follows:
1. A gaussian process $X(t)$ is completely specified by the set of means
$$\mu_i = E[X(t_i)] \qquad i = 1, \ldots, n$$
and the set of autocorrelations
$$R_{XX}(t_i, t_j) = E[X(t_i)X(t_j)] \qquad i, j = 1, \ldots, n$$
2. If the set of random variables $X(t_i)$, $i = 1, \ldots, n$, is uncorrelated, that is,
$$C_{ij} = 0 \qquad i \ne j$$
then the $X(t_i)$ are independent.
3. If a gaussian process $X(t)$ is WSS, then $X(t)$ is SSS.
4. If the input process $X(t)$ of a linear system is gaussian, then the output process $Y(t)$ is also gaussian.
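Because a gaussian process is completely specified by its means and autocorrelations, sample vectors at $n$ time instants can be drawn directly from the multivariate normal of Eqs (7.58)-(7.61). The sketch below (an assumed model, not from the text: zero mean with $R_{XX}(\tau) = e^{-|\tau|}$) does exactly that and checks the ensemble covariance against the matrix $C$:

```python
import numpy as np

# Sketch: sample a zero-mean gaussian process at n time instants using the
# covariance matrix C_ij = R_XX(t_i - t_j), with the assumed model
# R_XX(tau) = exp(-|tau|).
rng = np.random.default_rng(1)
t = np.linspace(0.0, 4.0, 9)                      # n = 9 time instants
C = np.exp(-np.abs(t[:, None] - t[None, :]))      # covariance matrix, Eq. (7.60)

X = rng.multivariate_normal(np.zeros(len(t)), C, size=200_000)  # realizations

C_est = np.cov(X, rowvar=False)                   # ensemble estimate of C
err = np.max(np.abs(C_est - C))
print(err)   # small for a large ensemble
```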

B. White Noise
A random process $X(t)$ is called white noise if [Fig. 7.3(a)]
$$S_{XX}(\omega) = \frac{\eta}{2} \qquad (7.62)$$
Taking the inverse Fourier transform of Eq. (7.62), we have
$$R_{XX}(\tau) = \frac{\eta}{2}\,\delta(\tau) \qquad (7.63)$$
which is illustrated in Fig. 7.3(b). It is usually assumed that the mean of white noise is zero.
Fig. 7.3 White noise

C. Band-limited White Noise
A random process $X(t)$ is called band-limited white noise if
$$S_{XX}(\omega) = \begin{cases} \dfrac{\eta}{2} & |\omega| \le \omega_B \\ 0 & |\omega| > \omega_B \end{cases} \qquad (7.64)$$
Then
$$R_{XX}(\tau) = \frac{1}{2\pi}\int_{-\omega_B}^{\omega_B} \frac{\eta}{2} e^{j\omega\tau}\,d\omega = \frac{\eta\omega_B}{2\pi}\frac{\sin\omega_B\tau}{\omega_B\tau} \qquad (7.65)$$
$S_{XX}(\omega)$ and $R_{XX}(\tau)$ of band-limited white noise are shown in Fig. 7.4.
Note that the term white or band-limited white refers to the spectral shape of the process $X(t)$ only; these terms do not imply that the distribution associated with $X(t)$ is gaussian.
Fig. 7.4 Band-limited white noise

D. Narrowband Random Process
Suppose that $X(t)$ is a WSS process with zero mean and its power spectral density $S_{XX}(\omega)$ is nonzero only in some narrow frequency band of width $2W$ that is very small compared to a center frequency $\omega_c$, as shown in Fig. 7.5. Then the process $X(t)$ is called a narrowband random process.
Fig. 7.5 Narrowband random process
In many communication systems, a narrowband process (or noise) is produced when white noise (or broadband noise) is passed through a narrowband linear filter. When a sample function of the narrowband process is viewed on an oscilloscope, the observed waveform appears as a sinusoid of random amplitude and phase. For this reason, the narrowband noise $X(t)$ is conveniently represented by the expression
$$X(t) = V(t)\cos[\omega_c t + \phi(t)] \qquad (7.66)$$
where $V(t)$ and $\phi(t)$ are random processes that are called the envelope function and phase function, respectively.
Equation (7.66) can be rewritten as
$$X(t) = V(t)\cos\phi(t)\cos\omega_c t - V(t)\sin\phi(t)\sin\omega_c t = X_c(t)\cos\omega_c t - X_s(t)\sin\omega_c t \qquad (7.67)$$
where
$$X_c(t) = V(t)\cos\phi(t) \quad \text{(in-phase component)} \qquad (7.68a)$$
$$X_s(t) = V(t)\sin\phi(t) \quad \text{(quadrature component)} \qquad (7.68b)$$
$$V(t) = \sqrt{X_c^2(t) + X_s^2(t)} \qquad (7.69a)$$
$$\phi(t) = \tan^{-1}\frac{X_s(t)}{X_c(t)} \qquad (7.69b)$$
Equation (7.67) is called the quadrature representation of $X(t)$. Given $X(t)$, its in-phase component $X_c(t)$ and quadrature component $X_s(t)$ can be obtained by using the arrangement shown in Fig. 7.6.
Properties of $X_c(t)$ and $X_s(t)$:
1. $X_c(t)$ and $X_s(t)$ have identical power spectral densities related to that of $X(t)$ by
$$S_{X_c}(\omega) = S_{X_s}(\omega) = \begin{cases} S_{XX}(\omega - \omega_c) + S_{XX}(\omega + \omega_c) & |\omega| \le W \\ 0 & \text{otherwise} \end{cases} \qquad (7.70)$$
2. $X_c(t)$ and $X_s(t)$ have the same means and variances as $X(t)$:
$$\mu_{X_c} = \mu_{X_s} = \mu_X = 0 \qquad (7.71)$$
$$\sigma_{X_c}^2 = \sigma_{X_s}^2 = \sigma_X^2 \qquad (7.72)$$
3. $X_c(t)$ and $X_s(t)$ are uncorrelated with each other:
$$E[X_c(t)X_s(t)] = 0 \qquad (7.73)$$
4. If $X(t)$ is gaussian, then $X_c(t)$ and $X_s(t)$ are also gaussian.
5. If $X(t)$ is gaussian, then for fixed but arbitrary $t$, $V(t)$ is a random variable with Rayleigh distribution and $\phi(t)$ is a random variable uniformly distributed over $[0, 2\pi]$. (See Solved Problem 6.39.)
Fig. 7.6
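The Fig. 7.6 arrangement (multiply by $2\cos\omega_c t$ and $-2\sin\omega_c t$, then low-pass filter) can be sketched numerically. In the example below all waveforms and parameter values are illustrative assumptions, and a moving average over one carrier cycle stands in for a proper low-pass filter; the known in-phase and quadrature components of a constructed narrowband signal are recovered from $X(t)$ alone:

```python
import numpy as np

# Sketch of the Fig. 7.6 arrangement: recover X_c(t) and X_s(t) from the
# quadrature representation x = xc*cos(wc*t) - xs*sin(wc*t), Eq. (7.67).
fs, fc = 100_000.0, 1_000.0
t = np.arange(0, 0.2, 1.0 / fs)
wc = 2 * np.pi * fc

xc = 1.0 + 0.5 * np.cos(2 * np.pi * 10 * t)   # slowly varying in-phase part
xs = 0.8 * np.sin(2 * np.pi * 15 * t)         # slowly varying quadrature part
x = xc * np.cos(wc * t) - xs * np.sin(wc * t)

M = int(fs / fc)                              # average over one carrier cycle
lpf = lambda s: np.convolve(s, np.ones(M) / M, mode="same")
xc_hat = lpf(2 * x * np.cos(wc * t))          # -> X_c(t); 2*w_c terms averaged out
xs_hat = lpf(-2 * x * np.sin(wc * t))         # -> X_s(t)

mid = slice(M, len(t) - M)                    # ignore filter edge effects
err = max(np.max(np.abs(xc_hat[mid] - xc[mid])),
          np.max(np.abs(xs_hat[mid] - xs[mid])))
print(err)
```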

Random Processes and Statistics of Random Processes

7.1 Consider a random process $X(t)$ given by
$$X(t) = A\cos(\omega t + \Theta) \qquad (7.74)$$
where $A$ and $\omega$ are constants and $\Theta$ is a uniform random variable over $[-\pi, \pi]$. Show that $X(t)$ is WSS.

From Eq. (6.105) (Solved Problem 6.23), we have
$$f_\Theta(\theta) = \begin{cases} \dfrac{1}{2\pi} & -\pi \le \theta \le \pi \\ 0 & \text{otherwise} \end{cases}$$
Thus,
$$\mu_X(t) = E[X(t)] = \int_{-\pi}^{\pi} A\cos(\omega t + \theta) f_\Theta(\theta)\,d\theta = \frac{A}{2\pi}\int_{-\pi}^{\pi} \cos(\omega t + \theta)\,d\theta = 0 \qquad (7.75)$$
$$R_{XX}(t, t + \tau) = E[X(t)X(t + \tau)] = \frac{A^2}{2\pi}\int_{-\pi}^{\pi} \cos(\omega t + \theta)\cos[\omega(t + \tau) + \theta]\,d\theta = \frac{A^2}{4\pi}\int_{-\pi}^{\pi} [\cos\omega\tau + \cos(2\omega t + 2\theta + \omega\tau)]\,d\theta = \frac{A^2}{2}\cos\omega\tau \qquad (7.76)$$
Since the mean of $X(t)$ is a constant and the autocorrelation of $X(t)$ is a function of the time difference $\tau$ only, we conclude that $X(t)$ is WSS.
Note that $R_{XX}(\tau)$ is periodic with the period $T_0 = 2\pi/\omega$. A WSS random process is called periodic if its autocorrelation is periodic.
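A Monte Carlo check of this result is straightforward: draw many realizations of $\Theta$, form the ensemble averages at a fixed $t$, and compare with $\mu_X = 0$ and $R_{XX}(\tau) = (A^2/2)\cos\omega\tau$. The sketch below uses arbitrary illustrative values of $A$, $\omega$, $t$, and $\tau$:

```python
import numpy as np

# Monte Carlo check of (7.75)-(7.76): X(t) = A*cos(w*t + Theta), Theta ~ U[-pi, pi].
rng = np.random.default_rng(7)
A, w, n = 2.0, 3.0, 500_000
theta = rng.uniform(-np.pi, np.pi, n)      # one Theta per realization

t, tau = 0.4, 0.9                          # arbitrary test instants
x_t = A * np.cos(w * t + theta)
x_ttau = A * np.cos(w * (t + tau) + theta)

mean_est = np.mean(x_t)                    # should be near 0, any t
R_est = np.mean(x_t * x_ttau)              # should be near (A^2/2)*cos(w*tau)
R_theory = A ** 2 / 2 * np.cos(w * tau)
print(mean_est, R_est, R_theory)
```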

7.2 Consider a random process $X(t)$ given by
$$X(t) = A\cos(\omega t + \theta) \qquad (7.77)$$
where $\omega$ and $\theta$ are constants and $A$ is a random variable. Determine whether $X(t)$ is WSS.

$$\mu_X(t) = E[X(t)] = E[A]\cos(\omega t + \theta) \qquad (7.78)$$
which indicates that the mean of $X(t)$ is not constant unless $E[A] = 0$.
$$R_{XX}(t, t + \tau) = E[X(t)X(t + \tau)] = E[A^2]\cos(\omega t + \theta)\cos[\omega(t + \tau) + \theta] = \tfrac{1}{2}E[A^2][\cos\omega\tau + \cos(2\omega t + 2\theta + \omega\tau)] \qquad (7.79)$$
Thus, we see that the autocorrelation of $X(t)$ is not a function of the time difference $\tau$ only, and the process $X(t)$ is not WSS.

7.3 Consider a random process $Y(t)$ defined by
$$Y(t) = \int_0^t X(\tau)\,d\tau \qquad (7.80)$$
where $X(t)$ is given by
$$X(t) = A\cos\omega t \qquad (7.81)$$
where $\omega$ is constant and $A = N[0; \sigma^2]$.
(a) Determine the pdf of $Y(t)$ at $t = t_k$.
(b) Is $Y(t)$ WSS?

(a)
$$Y(t_k) = \int_0^{t_k} A\cos\omega\tau\,d\tau = A\,\frac{\sin\omega t_k}{\omega} \qquad (7.82)$$
Then from the result of Solved Problem 6.32, we see that $Y(t_k)$ is a gaussian random variable with
$$E[Y(t_k)] = \frac{\sin\omega t_k}{\omega}E[A] = 0 \qquad (7.83)$$
and
$$\sigma_{Y(t_k)}^2 = \operatorname{var}[Y(t_k)] = \left(\frac{\sin\omega t_k}{\omega}\right)^2\sigma^2 \qquad (7.84)$$
Hence, by Eq. (6.91), the pdf of $Y(t_k)$ is
$$f_Y(y) = \frac{1}{\sqrt{2\pi}\,\sigma_{Y(t_k)}}\,e^{-y^2/\left(2\sigma_{Y(t_k)}^2\right)} \qquad (7.85)$$
(b) From Eqs (7.83) and (7.84), the mean and variance of $Y(t)$ depend on time $t$, so $Y(t)$ is not WSS.

7.4 Consider a random process $X(t)$ given by
$$X(t) = A\cos\omega t + B\sin\omega t \qquad (7.86)$$
where $\omega$ is constant and $A$ and $B$ are random variables.
(a) Show that the condition
$$E[A] = E[B] = 0 \qquad (7.87)$$
is necessary for $X(t)$ to be stationary.
(b) Show that $X(t)$ is WSS if and only if the random variables $A$ and $B$ are uncorrelated with equal variance, that is,
$$E[AB] = 0 \qquad (7.88)$$
$$E[A^2] = E[B^2] = \sigma^2 \qquad (7.89)$$

(a) $\mu_X(t) = E[X(t)] = E[A]\cos\omega t + E[B]\sin\omega t$ must be independent of $t$ for $X(t)$ to be stationary. This is possible only if $\mu_X(t) = 0$, that is,
$$E[A] = E[B] = 0$$
(b) If $X(t)$ is WSS, then from Eq. (7.17)
$$E[X^2(0)] = E\!\left[X^2\!\left(\frac{\pi}{2\omega}\right)\right] = R_{XX}(0) = \sigma_X^2$$
But $X(0) = A$ and $X\!\left(\dfrac{\pi}{2\omega}\right) = B$.
Thus,
$$E[A^2] = E[B^2] = \sigma_X^2 = \sigma^2$$
Using the preceding result, we obtain
$$R_{XX}(t, t + \tau) = E[X(t)X(t + \tau)] = E\{(A\cos\omega t + B\sin\omega t)[A\cos\omega(t + \tau) + B\sin\omega(t + \tau)]\} = \sigma^2\cos\omega\tau + E[AB]\sin(2\omega t + \omega\tau) \qquad (7.90)$$
which will be a function of $\tau$ only if $E[AB] = 0$.
Conversely, if $E[AB] = 0$ and $E[A^2] = E[B^2] = \sigma^2$, then from the result of part (a) and Eq. (7.90), we have $\mu_X(t) = 0$ and
$$R_{XX}(t, t + \tau) = \sigma^2\cos\omega\tau = R_{XX}(\tau)$$
Hence, $X(t)$ is WSS.

7.5 A random process $X(t)$ is said to be covariance-stationary if the covariance of $X(t)$ depends only on the time difference $\tau = t_2 - t_1$, that is,
$$C_{XX}(t, t + \tau) = C_{XX}(\tau) \qquad (7.91)$$
Let $X(t)$ be given by
$$X(t) = (A + 1)\cos t + B\sin t$$
where $A$ and $B$ are independent random variables for which
$$E[A] = E[B] = 0 \qquad E[A^2] = E[B^2] = 1$$
Show that $X(t)$ is not WSS, but it is covariance-stationary.

$$\mu_X(t) = E[X(t)] = E[(A + 1)\cos t + B\sin t] = E[A + 1]\cos t + E[B]\sin t = \cos t$$
which depends on $t$. Thus, $X(t)$ cannot be WSS.
$$R_{XX}(t_1, t_2) = E[X(t_1)X(t_2)] = E\{[(A + 1)\cos t_1 + B\sin t_1][(A + 1)\cos t_2 + B\sin t_2]\} = E[(A + 1)^2]\cos t_1\cos t_2 + E[B^2]\sin t_1\sin t_2 + E[(A + 1)B](\cos t_1\sin t_2 + \sin t_1\cos t_2)$$
Now
$$E[(A + 1)^2] = E[A^2 + 2A + 1] = E[A^2] + 2E[A] + 1 = 2$$
$$E[(A + 1)B] = E[AB] + E[B] = E[A]E[B] + E[B] = 0$$
$$E[B^2] = 1$$
Substituting these values into the expression for $R_{XX}(t_1, t_2)$, we obtain
$$R_{XX}(t_1, t_2) = 2\cos t_1\cos t_2 + \sin t_1\sin t_2 = \cos(t_2 - t_1) + \cos t_1\cos t_2$$
From Eq. (7.9), we have
$$C_{XX}(t_1, t_2) = R_{XX}(t_1, t_2) - \mu_X(t_1)\mu_X(t_2) = \cos(t_2 - t_1) + \cos t_1\cos t_2 - \cos t_1\cos t_2 = \cos(t_2 - t_1)$$
Thus, $X(t)$ is covariance-stationary.

7.6 Show that the process $X(t)$ defined in Eq. (7.74) (Solved Problem 7.1) is ergodic in both the mean and the autocorrelation.

From Eq. (7.20), we have
$$\bar{x} = \langle x(t)\rangle = \lim_{T\to\infty}\frac{1}{T}\int_{-T/2}^{T/2} A\cos(\omega t + \theta)\,dt = \frac{1}{T_0}\int_{-T_0/2}^{T_0/2} A\cos(\omega t + \theta)\,dt = 0 \qquad (7.92)$$
where $T_0 = 2\pi/\omega$.
From Eq. (7.21), we have
$$\bar{R}_{XX}(\tau) = \langle x(t)x(t + \tau)\rangle = \lim_{T\to\infty}\frac{1}{T}\int_{-T/2}^{T/2} A^2\cos(\omega t + \theta)\cos[\omega(t + \tau) + \theta]\,dt = \frac{A^2}{2T_0}\int_{-T_0/2}^{T_0/2}[\cos\omega\tau + \cos(2\omega t + 2\theta + \omega\tau)]\,dt = \frac{A^2}{2}\cos\omega\tau \qquad (7.93)$$
Thus, we have
$$\mu_X(t) = E[X(t)] = \langle x(t)\rangle = \bar{x} = 0$$
$$R_{XX}(\tau) = E[X(t)X(t + \tau)] = \langle x(t)x(t + \tau)\rangle = \bar{R}_{XX}(\tau)$$
Hence, by definitions (7.24) and (7.25), we conclude that $X(t)$ is ergodic in both the mean and the autocorrelation.

Correlation and Power Spectra


7.7 Show that
$$E\{[X(t + \tau) - X(t)]^2\} = 2[R_{XX}(0) - R_{XX}(\tau)] \qquad (7.94)$$
where $R_{XX}(\tau)$ is the autocorrelation of $X(t)$.

Using the linearity of $E$ (the expectation operator) and Eqs (7.15) and (7.17), we have
$$E\{[X(t + \tau) - X(t)]^2\} = E[X^2(t + \tau) - 2X(t + \tau)X(t) + X^2(t)] = E[X^2(t + \tau)] - 2E[X(t + \tau)X(t)] + E[X^2(t)] = R_{XX}(0) - 2R_{XX}(\tau) + R_{XX}(0) = 2[R_{XX}(0) - R_{XX}(\tau)]$$

7.8 For a WSS random process $X(t)$, verify Eqs (7.26) and (7.27), that is,
(a) $R_{XX}(-\tau) = R_{XX}(\tau)$
(b) $|R_{XX}(\tau)| \le R_{XX}(0)$

(a) From Eq. (7.15)
$$R_{XX}(\tau) = E[X(t)X(t + \tau)]$$
Setting $t + \tau = t'$, we have
$$R_{XX}(\tau) = E[X(t' - \tau)X(t')] = E[X(t')X(t' - \tau)] = R_{XX}(-\tau)$$
(b)
$$E\{[X(t) \pm X(t + \tau)]^2\} \ge 0$$
or
$$E[X^2(t)] \pm 2E[X(t)X(t + \tau)] + E[X^2(t + \tau)] \ge 0$$
or
$$2R_{XX}(0) \pm 2R_{XX}(\tau) \ge 0$$
Hence, $R_{XX}(0) \ge |R_{XX}(\tau)|$.
"(0)
7.9 Show that the power spectrum of a (real) random process $X(t)$ is real, and verify Eq. (7.39), that is,
$$S_{XX}(-\omega) = S_{XX}(\omega)$$

From Eq. (7.36) and by expanding the exponential, we have
$$S_{XX}(\omega) = \int_{-\infty}^{\infty} R_{XX}(\tau) e^{-j\omega\tau}\,d\tau = \int_{-\infty}^{\infty} R_{XX}(\tau)(\cos\omega\tau - j\sin\omega\tau)\,d\tau = \int_{-\infty}^{\infty} R_{XX}(\tau)\cos\omega\tau\,d\tau - j\int_{-\infty}^{\infty} R_{XX}(\tau)\sin\omega\tau\,d\tau \qquad (7.95)$$
Since $R_{XX}(-\tau) = R_{XX}(\tau)$ [Eq. (7.26)], the imaginary term in Eq. (7.95) vanishes and we obtain
$$S_{XX}(\omega) = \int_{-\infty}^{\infty} R_{XX}(\tau)\cos\omega\tau\,d\tau \qquad (7.96)$$
which indicates that $S_{XX}(\omega)$ is real.
Since the cosine is an even function of its argument, that is, $\cos(-\omega\tau) = \cos\omega\tau$, it follows that
$$S_{XX}(-\omega) = S_{XX}(\omega)$$
which indicates that the power spectrum of $X(t)$ is an even function of frequency.

7.10 A class of modulated random signal $Y(t)$ is defined by
$$Y(t) = AX(t)\cos(\omega_c t + \Theta) \qquad (7.97)$$
where $X(t)$ is the random message signal and $A\cos(\omega_c t + \Theta)$ is the carrier. The random message signal $X(t)$ is a zero-mean stationary random process with autocorrelation $R_{XX}(\tau)$ and power spectrum $S_{XX}(\omega)$. The carrier amplitude $A$ and the frequency $\omega_c$ are constants, and phase $\Theta$ is a random variable uniformly distributed over $[0, 2\pi]$. Assuming that $X(t)$ and $\Theta$ are independent, find the mean, autocorrelation, and power spectrum of $Y(t)$.

$$\mu_Y(t) = E[Y(t)] = E[AX(t)\cos(\omega_c t + \Theta)] = AE[X(t)]E[\cos(\omega_c t + \Theta)] = 0$$
since $X(t)$ and $\Theta$ are independent and $E[X(t)] = 0$.
$$R_{YY}(t, t + \tau) = E[Y(t)Y(t + \tau)] = E\{A^2 X(t)X(t + \tau)\cos(\omega_c t + \Theta)\cos[\omega_c(t + \tau) + \Theta]\} = \frac{A^2}{2}E[X(t)X(t + \tau)]E[\cos\omega_c\tau + \cos(2\omega_c t + \omega_c\tau + 2\Theta)] = \frac{A^2}{2}R_{XX}(\tau)\cos\omega_c\tau = R_{YY}(\tau) \qquad (7.98)$$
Since the mean of $Y(t)$ is a constant and the autocorrelation of $Y(t)$ depends only on the time difference $\tau$, $Y(t)$ is WSS. Thus,
$$S_{YY}(\omega) = \mathscr{F}[R_{YY}(\tau)] = \frac{A^2}{2}\mathscr{F}[R_{XX}(\tau)\cos\omega_c\tau]$$
By Eqs (7.96) and (7.98),
$$\mathscr{F}[R_{XX}(\tau)] = S_{XX}(\omega) \qquad \mathscr{F}[\cos\omega_c\tau] = \pi\delta(\omega - \omega_c) + \pi\delta(\omega + \omega_c)$$
Then, using the frequency convolution theorem (1.29) and Eq. (1.36), we obtain
$$S_{YY}(\omega) = \frac{A^2}{2}\frac{1}{2\pi}S_{XX}(\omega) * [\pi\delta(\omega - \omega_c) + \pi\delta(\omega + \omega_c)] = \frac{A^2}{4}[S_{XX}(\omega - \omega_c) + S_{XX}(\omega + \omega_c)] \qquad (7.99)$$

Fig. 7.7 Telegraph signal
7.11 Consider a random process $X(t)$ that assumes the values $\pm A$ with equal probability. A typical sample function of $X(t)$ is shown in Fig. 7.7. The average number of polarity switches (zero crossings) per unit time is $\alpha$. The probability of having exactly $k$ crossings in time $\tau$ is given by the Poisson distribution [Eq. (6.88)]
$$P(Z = k) = e^{-\alpha\tau}\frac{(\alpha\tau)^k}{k!} \qquad (7.100)$$
where $Z$ is the random variable representing the number of zero crossings. The process $X(t)$ is known as the random telegraph signal. Find the autocorrelation and power spectrum of $X(t)$.

If $\tau$ is any positive time interval, then
$$R_{XX}(t, t + \tau) = E[X(t)X(t + \tau)] = A^2 P[X(t) \text{ and } X(t + \tau) \text{ have same signs}] + (-A^2)P[X(t) \text{ and } X(t + \tau) \text{ have different signs}] = A^2 P[Z \text{ even in } (t, t + \tau)] - A^2 P[Z \text{ odd in } (t, t + \tau)] = A^2\!\!\sum_{k\ \mathrm{even}} e^{-\alpha\tau}\frac{(\alpha\tau)^k}{k!} - A^2\!\!\sum_{k\ \mathrm{odd}} e^{-\alpha\tau}\frac{(\alpha\tau)^k}{k!} = A^2 e^{-\alpha\tau}\sum_{k=0}^{\infty}\frac{(-\alpha\tau)^k}{k!} = A^2 e^{-\alpha\tau}e^{-\alpha\tau} = A^2 e^{-2\alpha\tau} \qquad (7.101)$$
which indicates that the autocorrelation depends only on the time difference $\tau$. By Eq. (7.26), the complete solution that includes $\tau < 0$ is given by
$$R_{XX}(\tau) = A^2 e^{-2\alpha|\tau|} \qquad (7.102)$$
which is sketched in Fig. 7.8(a).
Taking the Fourier transform of both sides of Eq. (7.102), we see that the power spectrum of $X(t)$ is [Eq. (1.55)]
$$S_{XX}(\omega) = A^2\frac{4\alpha}{\omega^2 + (2\alpha)^2} \qquad (7.103)$$
which is sketched in Fig. 7.8(b).
Fig. 7.8
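The telegraph signal is easy to simulate, and its time-averaged autocorrelation can be checked against Eq. (7.102). The sketch below (parameter values are illustrative assumptions) generates sign flips from Poisson counts per time bin; the parity of a Poisson count is exact under binning, so the discretization does not bias the result:

```python
import numpy as np

# Sketch simulation of the random telegraph signal: sign flips at Poisson(alpha)
# switching times, values +/-A.  Compare the time-averaged autocorrelation with
# R_XX(tau) = A^2 * exp(-2*alpha*|tau|), Eq. (7.102).
rng = np.random.default_rng(11)
A, alpha, T, dt = 1.0, 2.0, 20_000.0, 0.01

flips = rng.poisson(alpha * dt, int(T / dt))  # switch count in each dt-bin
s0 = A if rng.random() < 0.5 else -A          # random initial polarity
x = s0 * np.cumprod(np.where(flips % 2 == 0, 1.0, -1.0))  # flip on odd counts

taus = np.array([0.0, 0.25, 0.5, 1.0])
ks = (taus / dt).round().astype(int)
R_est = np.array([np.mean(x[: len(x) - k] * x[k:]) if k else np.mean(x * x)
                  for k in ks])
R_theory = A ** 2 * np.exp(-2 * alpha * taus)
print(R_est, R_theory)
```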
7.12 Consider a random binary process $X(t)$ consisting of a random sequence of binary symbols 1 and 0. A typical sample function of $X(t)$ is shown in Fig. 7.9. It is assumed that
1. The symbols 1 and 0 are represented by pulses of amplitude $+A$ and $-A$ V, respectively, and duration $T_b$ seconds.
2. The two symbols 1 and 0 are equally likely, and the presence of a 1 or 0 in any one interval is independent of the presence in all other intervals.
3. The pulse sequence is not synchronized, so that the starting time $T_d$ of the first pulse after $t = 0$ is equally likely to lie anywhere in $[0, T_b]$.
Find the autocorrelation and power spectrum of $X(t)$.

The random binary process $X(t)$ can be represented by
$$X(t) = \sum_{k=-\infty}^{\infty} A_k\,p(t - kT_b - T_d) \qquad (7.104)$$
where $\{A_k\}$ is a sequence of independent random variables with $P(A_k = A) = P(A_k = -A) = \tfrac{1}{2}$, $p(t)$ is a unit-amplitude pulse of duration $T_b$, and $T_d$ is a random variable uniformly distributed over $[0, T_b]$.
$$\mu_X(t) = E[X(t)] = E[A_k] = A\left(\tfrac{1}{2}\right) + (-A)\left(\tfrac{1}{2}\right) = 0 \qquad (7.105)$$
Let $t_2 > t_1$. When $t_2 - t_1 > T_b$, then $t_1$ and $t_2$ must fall in different pulse intervals [Fig. 7.10(a)] and the random variables $X(t_1)$ and $X(t_2)$ are therefore independent. We thus have
$$R_{XX}(t_1, t_2) = E[X(t_1)X(t_2)] = E[X(t_1)]E[X(t_2)] = 0 \qquad (7.106)$$
When $t_2 - t_1 < T_b$, then depending on the value of $T_d$, $t_1$ and $t_2$ may or may not be in the same pulse interval [Fig. 7.10(b) and (c)]. If we let $B$ denote the random event "$t_1$ and $t_2$ are in adjacent pulse intervals," then we have
$$R_{XX}(t_1, t_2) = E[X(t_1)X(t_2)\,|\,B]P(B) + E[X(t_1)X(t_2)\,|\,\bar{B}]P(\bar{B})$$
Now
$$E[X(t_1)X(t_2)\,|\,B] = E[X(t_1)]E[X(t_2)] = 0 \qquad E[X(t_1)X(t_2)\,|\,\bar{B}] = A^2$$
Fig. 7.10
Since $P(B)$ will be the same when $t_1$ and $t_2$ fall in any time range of length $T_b$, it suffices to consider the case $0 < t_1 < t_2 < T_b$, as shown in Fig. 7.10(b). From Fig. 7.10(b),
$$P(B) = P(t_1 < T_d < t_2) = \int_{t_1}^{t_2} f_{T_d}(t)\,dt = \int_{t_1}^{t_2}\frac{dt}{T_b} = \frac{t_2 - t_1}{T_b}$$
From Eq. (6.4), we have
$$P(\bar{B}) = 1 - P(B) = 1 - \frac{t_2 - t_1}{T_b}$$
Thus,
$$R_{XX}(t_1, t_2) = A^2\left(1 - \frac{t_2 - t_1}{T_b}\right) = R_{XX}(\tau) \qquad (7.107)$$
where $\tau = t_2 - t_1$.
Since $R_{XX}(-\tau) = R_{XX}(\tau)$, we conclude that
$$R_{XX}(\tau) = \begin{cases} A^2\left(1 - \dfrac{|\tau|}{T_b}\right) & |\tau| < T_b \\ 0 & |\tau| > T_b \end{cases} \qquad (7.108)$$
which is plotted in Fig. 7.11(a).
From Eqs (7.105) and (7.108), we see that $X(t)$ is WSS. Thus, from Eq. (7.36), the power spectrum of $X(t)$ is
$$S_{XX}(\omega) = A^2 T_b\left(\frac{\sin(\omega T_b/2)}{\omega T_b/2}\right)^2 \qquad (7.109)$$
which is plotted in Fig. 7.11(b).
Fig. 7.11
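The triangular autocorrelation of Eq. (7.108) can be verified by time-averaging one long sample path of the binary wave. The sketch below (parameter values are illustrative assumptions) builds the waveform by holding each random bit for $T_b$ seconds with a random start offset:

```python
import numpy as np

# Sketch simulation of the random binary wave: +/-A pulses of duration Tb with
# a random delay Td ~ U[0, Tb).  The estimated autocorrelation should follow
# the triangle of Eq. (7.108).
rng = np.random.default_rng(12)
A, Tb, dt, nbits = 1.0, 1.0, 0.01, 50_000

bits = rng.choice([-A, A], size=nbits)            # equally likely +/-A
x = np.repeat(bits, int(Tb / dt))                 # hold each bit for Tb seconds
shift = rng.integers(0, int(Tb / dt))             # random delay Td
x = x[shift:]

taus = np.array([0.0, 0.25, 0.5, 0.75, 1.0, 1.5])
ks = (taus / dt).round().astype(int)
R_est = np.array([np.mean(x[: len(x) - k] * x[k:]) if k else np.mean(x * x)
                  for k in ks])
R_theory = np.where(np.abs(taus) < Tb, A ** 2 * (1 - np.abs(taus) / Tb), 0.0)
print(R_est, R_theory)
```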

7.13 Let $X(t)$ and $Y(t)$ be WSS random processes. Verify Eqs (7.29) and (7.30), that is,
(a) $R_{XY}(-\tau) = R_{YX}(\tau)$
(b) $|R_{XY}(\tau)| \le \sqrt{R_{XX}(0)R_{YY}(0)}$

(a) By Eq. (7.18)
$$R_{XY}(-\tau) = E[X(t)Y(t - \tau)]$$
Setting $t - \tau = t'$, we obtain
$$R_{XY}(-\tau) = E[X(t' + \tau)Y(t')] = E[Y(t')X(t' + \tau)] = R_{YX}(\tau)$$
(b) From the Cauchy-Schwarz inequality, Eq. (6.148) (Solved Problem 6.52), it follows that
$$\{E[X(t)Y(t + \tau)]\}^2 \le E[X^2(t)]E[Y^2(t + \tau)]$$
or
$$|R_{XY}(\tau)|^2 \le R_{XX}(0)R_{YY}(0)$$
Thus,
$$|R_{XY}(\tau)| \le \sqrt{R_{XX}(0)R_{YY}(0)}$$

7.14 Let $X(t)$ and $Y(t)$ be both zero-mean and WSS random processes. Consider the random process $Z(t)$ defined by
$$Z(t) = X(t) + Y(t) \qquad (7.110)$$
(a) Determine the autocorrelation and the power spectrum of $Z(t)$ if $X(t)$ and $Y(t)$ are jointly WSS.
(b) Repeat part (a) if $X(t)$ and $Y(t)$ are orthogonal.
(c) Show that if $X(t)$ and $Y(t)$ are orthogonal, then the mean square of $Z(t)$ is equal to the sum of the mean squares of $X(t)$ and $Y(t)$.

(a) The autocorrelation of $Z(t)$ is given by
$$R_{ZZ}(t_1, t_2) = E[Z(t_1)Z(t_2)] = E\{[X(t_1) + Y(t_1)][X(t_2) + Y(t_2)]\} = E[X(t_1)X(t_2)] + E[X(t_1)Y(t_2)] + E[Y(t_1)X(t_2)] + E[Y(t_1)Y(t_2)] = R_{XX}(t_1, t_2) + R_{XY}(t_1, t_2) + R_{YX}(t_1, t_2) + R_{YY}(t_1, t_2) \qquad (7.111)$$
If $X(t)$ and $Y(t)$ are jointly WSS, then we have
$$R_{ZZ}(\tau) = R_{XX}(\tau) + R_{XY}(\tau) + R_{YX}(\tau) + R_{YY}(\tau) \qquad (7.112)$$
where $\tau = t_2 - t_1$.
Taking the Fourier transform of both sides of Eq. (7.112), we obtain
$$S_{ZZ}(\omega) = S_{XX}(\omega) + S_{XY}(\omega) + S_{YX}(\omega) + S_{YY}(\omega) \qquad (7.113)$$
(b) If $X(t)$ and $Y(t)$ are orthogonal [Eq. (7.34)],
$$R_{XY}(\tau) = R_{YX}(\tau) = 0$$
Then Eqs (7.112) and (7.113) become
$$R_{ZZ}(\tau) = R_{XX}(\tau) + R_{YY}(\tau) \qquad (7.114)$$
and
$$S_{ZZ}(\omega) = S_{XX}(\omega) + S_{YY}(\omega) \qquad (7.115)$$
(c) From Eqs (7.114) and (7.17),
$$R_{ZZ}(0) = R_{XX}(0) + R_{YY}(0)$$
or
$$E[Z^2(t)] = E[X^2(t)] + E[Y^2(t)] \qquad (7.116)$$
which indicates that the mean square of $Z(t)$ is equal to the sum of the mean squares of $X(t)$ and $Y(t)$.

7.15 Two random processes $X(t)$ and $Y(t)$ are given by
$$X(t) = A\cos(\omega t + \Theta) \qquad (7.117a)$$
$$Y(t) = A\sin(\omega t + \Theta) \qquad (7.117b)$$
where $A$ and $\omega$ are constants and $\Theta$ is a uniform random variable over $[0, 2\pi]$. Find the cross-correlation of $X(t)$ and $Y(t)$, and verify Eq. (7.29).

From Eq. (7.18), the cross-correlation of $X(t)$ and $Y(t)$ is
$$R_{XY}(t, t + \tau) = E[X(t)Y(t + \tau)] = E\{A^2\cos(\omega t + \Theta)\sin[\omega(t + \tau) + \Theta]\} = \frac{A^2}{2}E[\sin(2\omega t + \omega\tau + 2\Theta) + \sin\omega\tau] = \frac{A^2}{2}\sin\omega\tau = R_{XY}(\tau) \qquad (7.118a)$$
Similarly,
$$R_{YX}(t, t + \tau) = E[Y(t)X(t + \tau)] = E\{A^2\sin(\omega t + \Theta)\cos[\omega(t + \tau) + \Theta]\} = \frac{A^2}{2}E[\sin(2\omega t + \omega\tau + 2\Theta) + \sin(-\omega\tau)] = -\frac{A^2}{2}\sin\omega\tau = R_{YX}(\tau) \qquad (7.118b)$$
From Eqs (7.118a) and (7.118b),
$$R_{XY}(-\tau) = \frac{A^2}{2}\sin(-\omega\tau) = -\frac{A^2}{2}\sin\omega\tau = R_{YX}(\tau)$$
which verifies Eq. (7.29).

7.16 Consider two random processes $X(t)$ and $Y(t)$ given by
$$X(t) = A\cos\omega t + B\sin\omega t \qquad (7.119a)$$
$$Y(t) = B\cos\omega t - A\sin\omega t \qquad (7.119b)$$
where $\omega$ is constant and $A$ and $B$ are independent random variables, both with zero mean and variance $\sigma^2$. Find the cross-correlation of $X(t)$ and $Y(t)$.

The cross-correlation of $X(t)$ and $Y(t)$ is
$$R_{XY}(t_1, t_2) = E[X(t_1)Y(t_2)] = E[(A\cos\omega t_1 + B\sin\omega t_1)(B\cos\omega t_2 - A\sin\omega t_2)] = E[AB](\cos\omega t_1\cos\omega t_2 - \sin\omega t_1\sin\omega t_2) - E[A^2]\cos\omega t_1\sin\omega t_2 + E[B^2]\sin\omega t_1\cos\omega t_2$$
Since
$$E[AB] = E[A]E[B] = 0 \qquad E[A^2] = E[B^2] = \sigma^2$$
we have
$$R_{XY}(t_1, t_2) = \sigma^2(\sin\omega t_1\cos\omega t_2 - \cos\omega t_1\sin\omega t_2) = \sigma^2\sin\omega(t_1 - t_2)$$
or
$$R_{XY}(\tau) = -\sigma^2\sin\omega\tau \qquad (7.120)$$
where $\tau = t_2 - t_1$.

Transmission of Random Processes through Linear Systems

7.17 A WSS random process $X(t)$ with $E[X(t)] = 2$ is applied to the input of an LTI system with impulse response $h(t) = 3e^{-2t}u(t)$. Find the mean value of the output $Y(t)$ of the system.

Using Eq. (1.53), the frequency response $H(\omega)$ of the system is
$$H(\omega) = \mathscr{F}[h(t)] = \frac{3}{j\omega + 2}$$
Then, by Eq. (7.50), the mean value of $Y(t)$ is
$$\mu_Y(t) = E[Y(t)] = \mu_X H(0) = 2\left(\frac{3}{2}\right) = 3$$
7.18 Let $X(t)$ be the input to an LTI system with impulse response $h(t)$, and let $Y(t)$ be the corresponding output. Show that
(a)
$$R_{XY}(t_1, t_2) = \int_{-\infty}^{\infty} h(\beta) R_{XX}(t_1, t_2 - \beta)\,d\beta \qquad (7.121)$$
(b)
$$R_{YY}(t_1, t_2) = \int_{-\infty}^{\infty} h(\alpha) R_{XY}(t_1 - \alpha, t_2)\,d\alpha \qquad (7.122)$$

(a) Using Eq. (7.47), we have
$$R_{XY}(t_1, t_2) = E[X(t_1)Y(t_2)] = E\left[X(t_1)\int_{-\infty}^{\infty} h(\beta) X(t_2 - \beta)\,d\beta\right] = \int_{-\infty}^{\infty} h(\beta) E[X(t_1)X(t_2 - \beta)]\,d\beta = \int_{-\infty}^{\infty} h(\beta) R_{XX}(t_1, t_2 - \beta)\,d\beta$$
(b) Similarly,
$$R_{YY}(t_1, t_2) = E[Y(t_1)Y(t_2)] = E\left[\int_{-\infty}^{\infty} h(\alpha) X(t_1 - \alpha)\,d\alpha\,Y(t_2)\right] = \int_{-\infty}^{\infty} h(\alpha) E[X(t_1 - \alpha)Y(t_2)]\,d\alpha = \int_{-\infty}^{\infty} h(\alpha) R_{XY}(t_1 - \alpha, t_2)\,d\alpha$$
7.19 Let X(t) be a WSS random input process to an LTI system with impulse response h(t), and let Y(t) be the corresponding output process. Show that

(a) R_XY(τ) = h(τ) * R_XX(τ)   (7.123)
(b) R_YY(τ) = h(−τ) * R_XY(τ)   (7.124)
(c) S_XY(ω) = H(ω)S_XX(ω)   (7.125)
(d) S_YY(ω) = H*(ω)S_XY(ω)   (7.126)

where * denotes convolution and H*(ω) is the complex conjugate of H(ω).

(a) If X(t) is WSS, then Eq. (7.121) of Solved Problem 7.18 becomes

R_XY(t1, t2) = ∫_{−∞}^{∞} h(β)R_XX(t2 − t1 − β)dβ   (7.127)

which indicates that R_XY(t1, t2) is a function of the time difference τ = t2 − t1 only. Hence, Eq. (7.127) yields

R_XY(τ) = ∫_{−∞}^{∞} h(β)R_XX(τ − β)dβ = h(τ) * R_XX(τ)

(b) Similarly, if X(t) is WSS, then Eq. (7.122) becomes

R_YY(t1, t2) = ∫_{−∞}^{∞} h(α)R_XY(t2 − t1 + α)dα

or R_YY(τ) = ∫_{−∞}^{∞} h(α)R_XY(τ + α)dα = h(−τ) * R_XY(τ)

(c) Taking the Fourier transform of both sides of Eq. (7.123) and using Eqs (7.41) and (1.28), we obtain

S_XY(ω) = H(ω)S_XX(ω)

(d) Similarly, taking the Fourier transform of both sides of Eq. (7.124) and using Eqs (7.36), (1.28), and (1.21), we obtain

S_YY(ω) = H*(ω)S_XY(ω)

Note that by combining Eqs (7.125) and (7.126), we obtain Eq. (7.53), that is,

S_YY(ω) = H*(ω)H(ω)S_XX(ω) = |H(ω)|² S_XX(ω)
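Relation (7.123) has a convenient discrete-time analogue that can be checked by simulation. In the sketch below (my own construction, not from the text; the filter taps and sample counts are illustrative), the input is white, so R_XX[k] = δ[k] and the measured cross-correlation should reproduce the impulse response itself:

```python
import numpy as np

# Discrete-time sketch of Eq. (7.123), R_XY = h * R_XX: for a white input
# (R_XX[k] = delta[k]) the cross-correlation E[X[n] Y[n+k]] equals h[k].
rng = np.random.default_rng(1)
n = 400_000
X = rng.normal(size=n)                  # unit-variance white sequence
h = np.array([1.0, 0.5, 0.25])          # illustrative impulse response
Y = np.convolve(X, h, mode="full")[:n]  # Y[n] = sum_m h[m] X[n-m]

# Sample estimate of R_XY[k] = E[X[n] Y[n+k]] for k = 0, 1, 2
R_xy = [np.mean(X[: n - k] * Y[k:]) for k in range(len(h))]
print(R_xy)
```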

7.20 Let X(t) and Y(t) be the wide-sense stationary random input process and random output process, respectively, of a quadrature phase-shifting filter (a −π/2 rad phase shifter of Sec. 2.6). Show that

(a) R_XX(τ) = R_YY(τ)   (7.128)
(b) R_XY(τ) = R̂_XX(τ)   (7.129)

where R̂_XX(τ) is the Hilbert transform of R_XX(τ).

(a) The Hilbert transform X̂(t) of X(t) was defined in Sec. 2.6 as the output of a quadrature phase-shifting filter with

h(t) = 1/(πt)  ⟷  H(ω) = −j sgn(ω)

Since |H(ω)|² = 1, we conclude that if X(t) is a WSS random signal, then Y(t) = X̂(t) and by Eq. (7.53)

S_YY(ω) = |H(ω)|²S_XX(ω) = S_XX(ω)

Hence, R_YY(τ) = F⁻¹[S_YY(ω)] = F⁻¹[S_XX(ω)] = R_XX(τ)

(b) Using Eqs (7.123) and (2.26), we have

R_XY(τ) = h(τ) * R_XX(τ) = (1/(πτ)) * R_XX(τ) = R̂_XX(τ)


7.21 A WSS random process X(t) with autocorrelation

R_XX(τ) = Ae^{−a|τ|}

where A and a are real positive constants, is applied to the input of an LTI system with impulse response

h(t) = e^{−bt}u(t)

where b is a real positive constant. Find the autocorrelation of the output Y(t) of the system.

Using Eq. (1.53), we see that the frequency response H(ω) of the system is

H(ω) = F[h(t)] = 1/(jω + b)

So  |H(ω)|² = 1/(ω² + b²)

Using Eq. (1.55), we see that the power spectral density of X(t) is

S_XX(ω) = F[R_XX(τ)] = 2aA/(ω² + a²)

By Eq. (7.53), the power spectral density of Y(t) is

S_YY(ω) = |H(ω)|² S_XX(ω)
        = [1/(ω² + b²)][2aA/(ω² + a²)]
        = [aA/(a² − b²)][(1/b)·2b/(ω² + b²) − (1/a)·2a/(ω² + a²)]

Taking the inverse Fourier transform of both sides of the above equation and using Eq. (1.55), we obtain

R_YY(τ) = [aA/(a² − b²)][(1/b)e^{−b|τ|} − (1/a)e^{−a|τ|}]
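The partial-fraction step and the resulting closed form can be verified numerically. The sketch below (my addition; a, b, A are arbitrary illustrative values) inverts S_YY(ω) by direct numerical integration and compares it with R_YY(τ) = [aA/(a² − b²)][(1/b)e^{−b|τ|} − (1/a)e^{−a|τ|}]:

```python
import numpy as np

# Numerical inverse Fourier transform of
# S_YY(w) = 2aA / ((w^2 + a^2)(w^2 + b^2)) versus the closed form above.
a, b, A = 2.0, 0.5, 3.0                 # illustrative constants, a != b
w = np.linspace(-200.0, 200.0, 400_001)
dw = w[1] - w[0]
S_yy = 2 * a * A / ((w**2 + a**2) * (w**2 + b**2))

err = 0.0
for tau in (0.0, 0.7, 2.0):
    # R(tau) = (1/2pi) integral S(w) cos(w tau) dw  (S is real and even)
    R_num = np.sum(S_yy * np.cos(w * tau)) * dw / (2 * np.pi)
    R_th = a * A / (a**2 - b**2) * (np.exp(-b * tau) / b - np.exp(-a * tau) / a)
    err = max(err, abs(R_num - R_th))
print(err)   # discretization/truncation error only
```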
7.22 Verify Eq. (7.38); that is, the power spectrum S_XX(ω) of a WSS process X(t) is real, and S_XX(ω) ≥ 0 for every ω.

The realness of the power spectrum of X(t) was shown in Solved Problem 7.9. Consider an ideal bandpass filter with frequency response (Fig. 7.12)

H(ω) = { 1   ω1 < |ω| < ω2
       { 0   otherwise

with a random process X(t) as its input. From Eq. (7.53) it follows that the power spectrum S_YY(ω) of the resulting output Y(t) equals

S_YY(ω) = { S_XX(ω)   ω1 < |ω| < ω2
          { 0         otherwise

Hence, from Eq. (7.55), we have

E[Y²(t)] = (1/2π)∫_{−∞}^{∞} S_YY(ω)dω = (1/π)∫_{ω1}^{ω2} S_XX(ω)dω ≥ 0   (7.130)

which indicates that the area of S_XX(ω) in any interval of ω is nonnegative. This is possible only if S_XX(ω) ≥ 0 for every ω.

7.23 Consider a WSS process X(t) with autocorrelation R_XX(τ) and power spectrum S_XX(ω). Let X′(t) = dX(t)/dt. Show that

(a) R_XX′(τ) = dR_XX(τ)/dτ   (7.131)
(b) R_X′X′(τ) = −d²R_XX(τ)/dτ²   (7.132)
(c) S_X′X′(ω) = ω²S_XX(ω)   (7.133)

A system with frequency response H(ω) = jω is a differentiator (Fig. 7.13). Thus, if X(t) is its input, then its output is Y(t) = X′(t) [see Eq. (1.23)].

(a) From Eq. (7.125)

S_XX′(ω) = H(ω)S_XX(ω) = jωS_XX(ω)

Taking the inverse Fourier transform of both sides, we obtain

R_XX′(τ) = dR_XX(τ)/dτ

(b) From Eq. (7.126)

S_X′X′(ω) = H*(ω)S_XX′(ω) = −jωS_XX′(ω)

Again taking the inverse Fourier transform of both sides and using the result of part (a), we have

R_X′X′(τ) = −dR_XX′(τ)/dτ = −d²R_XX(τ)/dτ²

(c) From Eq. (7.53)

S_X′X′(ω) = |H(ω)|²S_XX(ω) = |jω|²S_XX(ω) = ω²S_XX(ω)
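Relations (7.132) and (7.133) can be spot-checked for any smooth autocorrelation. The sketch below (my own example) uses the gaussian pair R_XX(τ) = e^{−τ²/2} ⟷ S_XX(ω) = √(2π)e^{−ω²/2}, for which −d²R_XX/dτ² = (1 − τ²)e^{−τ²/2}, and inverts ω²S_XX(ω) numerically:

```python
import numpy as np

# Check Eqs (7.132)-(7.133): inverting S_X'X'(w) = w^2 S_XX(w) for the
# gaussian autocorrelation should give (1 - tau^2) exp(-tau^2/2).
w = np.linspace(-40.0, 40.0, 400_001)
dw = w[1] - w[0]
S_xpxp = w**2 * np.sqrt(2 * np.pi) * np.exp(-w**2 / 2)   # Eq. (7.133)

err = 0.0
for tau in (0.0, 0.5, 1.5):
    R_num = np.sum(S_xpxp * np.cos(w * tau)) * dw / (2 * np.pi)
    R_th = (1 - tau**2) * np.exp(-tau**2 / 2)   # -R_XX''(tau)
    err = max(err, abs(R_num - R_th))
print(err)
```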

7.24 Suppose that the input to the differentiator of Fig. 7.13 is the zero-mean random telegraph signal of Solved Problem 7.11.
(a) Determine the power spectrum of the differentiator output and plot it.
(b) Determine the mean-square value of the differentiator output.

(a) From Eq. (7.103) of Solved Problem 7.11,

S_XX(ω) = 4aA²/[ω² + (2a)²]

For the differentiator H(ω) = jω, and from Eq. (7.53), we have

S_YY(ω) = |H(ω)|²S_XX(ω) = 4aA²ω²/[ω² + (2a)²]   (7.134)

which is plotted in Fig. 7.14.

(b) From Eq. (7.55) or Fig. 7.14

E[Y²(t)] = (1/2π)∫_{−∞}^{∞} S_YY(ω)dω = ∞
7.25 Suppose the random telegraph signal of Solved Problem 7.11 is the input to an ideal bandpass filter with unit gain and narrow bandwidth W_B (= 2πB) (<< ω0) centered at ω0 = 2a. Find the dc component and the average power of the output.

From Eqs (7.53) and (7.103) and Fig. 7.8(b), the resulting output power spectrum

S_YY(ω) = |H(ω)|²S_XX(ω)

is shown in Fig. 7.15. Since H(0) = 0, from Eq. (7.50) we see that

μ_Y = H(0)μ_X = 0

Hence, the dc component of the output is zero.

From Eq. (7.103) (Solved Problem 7.11)

S_XX(ω0) = S_XX(2a) = 4aA²/[(2a)² + (2a)²] = A²/(2a)

Since W_B << ω0,

S_YY(ω) ≈ { A²/(2a)   ω0 − W_B/2 < |ω| < ω0 + W_B/2
          { 0         otherwise

The average output power is

E[Y²(t)] = (1/2π)∫_{−∞}^{∞} S_YY(ω)dω ≈ (1/2π)(2W_B)(A²/2a) = A²W_B/(2πa) = A²B/a   (7.135)

7.26 Suppose that a WSS process X(t) with power spectrum S_XX(ω) is the input to the filter shown in Fig. 7.16. Find the power spectrum of the output process Y(t).

From Fig. 7.16, Y(t) can be expressed as

Y(t) = X(t) − X(t − T)   (7.136)

From Eq. (2.5) the impulse response of the filter is

h(t) = δ(t) − δ(t − T)

and by Eqs (1.40) and (1.18) the frequency response of the filter is

H(ω) = 1 − e^{−jωT}

Thus, by Eq. (7.53) the output power spectrum is

S_YY(ω) = |H(ω)|²S_XX(ω) = |1 − e^{−jωT}|²S_XX(ω)
        = [(1 − cos ωT)² + sin² ωT]S_XX(ω)   (7.137)
        = 2(1 − cos ωT)S_XX(ω)
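Integrating Eq. (7.137) gives R_YY(0) = 2[R_XX(0) − R_XX(T)], which a discrete-time simulation makes concrete. In the sketch below (my own check; the delay and sample count are illustrative) a white input, for which R_XX(T) = 0 at any nonzero delay, comes out of the delay-and-subtract filter with exactly twice the power:

```python
import numpy as np

# Delay-and-subtract filter of Fig. 7.16 in discrete time:
# y[n] = x[n] - x[n - d], so for white x, E[y^2] = 2 * E[x^2].
rng = np.random.default_rng(2)
x = rng.normal(size=500_000)   # unit-variance white input
d = 7                          # delay T of 7 samples (illustrative)
y = x[d:] - x[:-d]

out_power = np.mean(y**2)
print(out_power)               # ~ 2.0
```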

7.27 Suppose that X(t) is the input to an LTI system with impulse response h1(t) and that Y(t) is the input to another LTI system with impulse response h2(t). It is assumed that X(t) and Y(t) are jointly wide-sense stationary. Let V(t) and Z(t) denote the random processes at the respective system outputs (Fig. 7.17). Find the cross-correlation and cross spectral density of V(t) and Z(t) in terms of the cross-correlation and cross spectral density of X(t) and Y(t).

Using Eq. (7.47), we have

R_VZ(t1, t2) = E[V(t1)Z(t2)]
             = E[∫_{−∞}^{∞} X(t1 − α)h1(α)dα ∫_{−∞}^{∞} Y(t2 − β)h2(β)dβ]
             = ∫_{−∞}^{∞}∫_{−∞}^{∞} h1(α)h2(β)E[X(t1 − α)Y(t2 − β)]dαdβ   (7.138)
             = ∫_{−∞}^{∞}∫_{−∞}^{∞} h1(α)h2(β)R_XY(t2 − t1 + α − β)dαdβ

since X(t) and Y(t) are jointly WSS.

Equation (7.138) indicates that R_VZ(t1, t2) depends only on the time difference τ = t2 − t1. Thus

R_VZ(τ) = ∫_{−∞}^{∞}∫_{−∞}^{∞} h1(α)h2(β)R_XY(τ + α − β)dαdβ   (7.139)

Taking the Fourier transform of both sides of Eq. (7.139), we obtain

S_VZ(ω) = ∫_{−∞}^{∞} R_VZ(τ)e^{−jωτ}dτ
        = ∫_{−∞}^{∞}∫_{−∞}^{∞}∫_{−∞}^{∞} h1(α)h2(β)R_XY(τ + α − β)e^{−jωτ}dαdβdτ

Let τ + α − β = λ, or equivalently τ = λ − α + β. Then

S_VZ(ω) = ∫_{−∞}^{∞}∫_{−∞}^{∞}∫_{−∞}^{∞} h1(α)h2(β)R_XY(λ)e^{−jω(λ−α+β)}dαdβdλ
        = ∫_{−∞}^{∞} h1(α)e^{jωα}dα ∫_{−∞}^{∞} h2(β)e^{−jωβ}dβ ∫_{−∞}^{∞} R_XY(λ)e^{−jωλ}dλ
        = H1(−ω)H2(ω)S_XY(ω)   (7.140)

where H1(ω) and H2(ω) are the frequency responses of the respective systems in Fig. 7.17.
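Relation (7.139) can be spot-checked in discrete time. In the sketch below (my own construction, not from the text; the taps are illustrative) Y(t) is taken equal to a white X(t), so R_XY[k] = δ[k] and the double integral collapses to the deterministic correlation of the two impulse responses, Σ_α h1[α]h2[α + k]:

```python
import numpy as np

# Discrete-time sketch of Eq. (7.139) with Y = X white:
# R_VZ[k] = sum_a h1[a] h2[a + k]; here we check lag k = 0.
rng = np.random.default_rng(4)
n = 400_000
x = rng.normal(size=n)
h1 = np.array([1.0, -0.5])
h2 = np.array([0.25, 1.0, 0.5])
v = np.convolve(x, h1, mode="full")[:n]
z = np.convolve(x, h2, mode="full")[:n]

R_vz0 = np.mean(v * z)                       # sample estimate at lag 0
R_th0 = float(np.sum(h1 * h2[: len(h1)]))    # 1.0*0.25 + (-0.5)*1.0 = -0.25
print(R_vz0, R_th0)
```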

Special Classes of Random Processes


7.28 Show that if a gaussian random process is WSS, then it is SSS.
If the gaussian process X(t) is WSS, then

μ_i = E[X(t_i)] = μ (= constant) for all t_i

and  R_XX(t_i, t_j) = R_XX(t_j − t_i)

Therefore, in the expression for the joint probability density of Eq. (7.58) and Eqs (7.59), (7.60), and (7.61),

μ1 = μ2 = ⋯ = μn = μ = E[X(t_i)] = E[X(t_i + c)]
C_ij = C_XX(t_i, t_j) = R_XX(t_i, t_j) − μ_iμ_j
     = R_XX(t_j − t_i) − μ² = C_XX(t_i + c, t_j + c)

for any c. It then follows that the joint density of X(t1), …, X(tn) equals that of X(t1 + c), …, X(tn + c) for any c. Therefore, X(t) is SSS by Eq. (7.11).
7.29 Let X be an n-dimensional gaussian random vector [Eq. (7.56)] with independent components. Show that the multivariate gaussian joint density function is given by

f_X(x) = ∏_{i=1}^{n} (2πσ_i²)^{−1/2} exp[−(x_i − μ_i)²/(2σ_i²)]   (7.141)

where μ_i = E[X_i] and σ_i² = var(X_i).

The multivariate gaussian density function is given by Eq. (7.58). Since X_i = X(t_i) are independent, we have

C_ij = { σ_i²   i = j
       { 0      i ≠ j   (7.142)

Thus, from Eq. (7.60) the covariance matrix C becomes the diagonal matrix

C = diag(σ1², σ2², …, σn²)   (7.143)

It therefore follows that

(det C)^{1/2} = σ1σ2 ⋯ σn = ∏_{i=1}^{n} σ_i   (7.144)

and  C⁻¹ = diag(1/σ1², 1/σ2², …, 1/σn²)   (7.145)

Then we can write

(x − μ)ᵀC⁻¹(x − μ) = ∑_{i=1}^{n} [(x_i − μ_i)/σ_i]²   (7.146)

Substituting Eqs (7.144) and (7.146) into Eq. (7.58), we obtain Eq. (7.141).
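The factorization (7.141) can be confirmed at any point by evaluating both forms. The sketch below (my addition; the means, variances, and test point are arbitrary illustrations) compares the quadratic-form density of Eq. (7.58) with the product of one-dimensional gaussians for a diagonal covariance matrix:

```python
import numpy as np

# Eq. (7.58) with diagonal C versus the product form of Eq. (7.141).
mu = np.array([1.0, -2.0, 0.5])      # illustrative means
sigma = np.array([0.8, 1.5, 2.0])    # illustrative standard deviations
C = np.diag(sigma**2)
x = np.array([0.7, -1.0, 2.0])       # arbitrary evaluation point

n = len(mu)
d = x - mu
quad = d @ np.linalg.inv(C) @ d      # (x - mu)^T C^{-1} (x - mu)
f_mvn = np.exp(-quad / 2) / ((2 * np.pi) ** (n / 2) * np.sqrt(np.linalg.det(C)))
f_prod = np.prod(np.exp(-(d / sigma) ** 2 / 2) / (np.sqrt(2 * np.pi) * sigma))
print(f_mvn, f_prod)                 # identical up to rounding
```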

7.30 Let X1 = X(t1) and X2 = X(t2) be jointly gaussian random variables, each with a zero mean and a variance σ². Show that the joint bivariate gaussian density function is given by

f_{X1X2}(x1, x2) = [1/(2πσ²(1 − ρ²)^{1/2})] exp[−(x1² − 2ρx1x2 + x2²)/(2σ²(1 − ρ²))]   (7.147)

where ρ is the correlation coefficient of X1 and X2, given by ρ = C12/σ² [Eq. (6.83)].

Substituting C11 = C22 = σ² and C12 = C21 = ρσ² into Eq. (7.60), we have

C = [σ²  ρσ²; ρσ²  σ²] = σ²[1  ρ; ρ  1]

det C = σ⁴(1 − ρ²)

C⁻¹ = [1/(σ²(1 − ρ²))][1  −ρ; −ρ  1]

Since μ = 0, x − μ = x, and

xᵀC⁻¹x = [1/(σ²(1 − ρ²))][x1  x2][1  −ρ; −ρ  1][x1; x2]
       = (x1² − 2ρx1x2 + x2²)/(σ²(1 − ρ²))

Substituting these results in Eq. (7.58), we obtain Eq. (7.147).
Note that Eq. (7.147) can be obtained from Eq. (6.107) (Solved Problem 6.28) by setting X = X1, Y = X2, μ_X = μ_Y = 0, and σ_X = σ_Y = σ.

7.31 The relation between the input X(t) and output Y(t) of a diode is expressed as

Y(t) = X²(t)   (7.148)

Let X(t) be a zero-mean stationary gaussian random process with autocorrelation

R_XX(τ) = e^{−a|τ|}   a > 0

Find the output mean μ_Y(t), the output autocorrelation R_YY(τ), and the output power spectral density S_YY(ω).

μ_Y(t) = E[Y(t)] = E[X²(t)] = R_XX(0)   (7.149)

R_YY(t1, t2) = E[Y(t1)Y(t2)] = E[X²(t1)X²(t2)]

Since X(t1) and X(t2) are zero-mean jointly gaussian random variables, by the result of Supplementary Problem 6.18,

E[X²(t1)X²(t2)] = E[X²(t1)]E[X²(t2)] + 2{E[X(t1)X(t2)]}²   (7.150)

Since X(t) is stationary,

E[X²(t1)] = E[X²(t2)] = R_XX(0)
and  E[X(t1)X(t2)] = R_XX(t2 − t1) = R_XX(τ)

Hence, R_YY(t1, t2) = R_YY(τ) = [R_XX(0)]² + 2[R_XX(τ)]²   (7.151)

and using Eqs (1.41) and (1.29), we have

S_YY(ω) = F[R_YY(τ)] = 2π[R_XX(0)]²δ(ω) + (1/π)S_XX(ω) * S_XX(ω)   (7.152)

Now, for the given input autocorrelation, by Eqs (7.149) and (7.151),

μ_Y(t) = R_XX(0) = 1
and  R_YY(τ) = 1 + 2e^{−2a|τ|}

By using Eqs (1.41) and (1.55), the output power spectral density is

S_YY(ω) = F[R_YY(τ)] = 2πδ(ω) + 8a/(ω² + 4a²)
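The gaussian fourth-moment identity (7.150), which drives the whole square-law result, is easy to confirm by simulation. The sketch below (my own check; r and the sample count are illustrative) draws correlated unit-variance gaussian pairs and verifies E[X1²X2²] = 1 + 2r²:

```python
import numpy as np

# Monte Carlo check of Eq. (7.150) for zero-mean, unit-variance jointly
# gaussian X1, X2 with correlation r: E[X1^2 X2^2] = 1 + 2 r^2.
rng = np.random.default_rng(3)
r = 0.6                                 # illustrative correlation
n = 1_000_000
z1 = rng.normal(size=n)
z2 = rng.normal(size=n)
x1 = z1
x2 = r * z1 + np.sqrt(1 - r**2) * z2    # corr(x1, x2) = r

m_est = np.mean(x1**2 * x2**2)
m_th = 1 + 2 * r**2                     # = 1.72 here
print(m_est, m_th)
```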
7.32 The input X(t) to the RC filter shown in Fig. 7.18 is a white noise process.
(a) Determine the power spectrum of the output process Y(t).
(b) Determine the autocorrelation and the mean-square value of Y(t).

From Solved Problem 2.6 the frequency response of the RC filter is

H(ω) = 1/(1 + jωRC)

(a) From Eqs (7.62) and (7.53), with S_XX(ω) = η/2,

S_YY(ω) = |H(ω)|²S_XX(ω) = (η/2)/[1 + (ωRC)²]   (7.153)

(b) Rewriting Eq. (7.153) as

S_YY(ω) = [η/(4RC)]·[2(1/RC)]/[ω² + (1/RC)²]

and using the Fourier transform pair e^{−a|τ|} ⟷ 2a/(ω² + a²), we obtain

R_YY(τ) = [η/(4RC)]e^{−|τ|/(RC)}   (7.154)

Finally, from Eq. (7.154)

E[Y²(t)] = R_YY(0) = η/(4RC)   (7.155)
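Result (7.155) can be checked by integrating the output spectrum numerically. The sketch below (my addition; η, R, and C are illustrative values) evaluates (1/2π)∫S_YY(ω)dω and compares it with η/(4RC):

```python
import numpy as np

# Numerical check of Eq. (7.155): integrate
# S_YY(w) = (eta/2) / (1 + (w R C)^2) and compare with eta / (4 R C).
eta, R, C = 2.0, 1e3, 1e-6          # illustrative values (R = 1 kOhm, C = 1 uF)
RC = R * C
w = np.linspace(-2e6, 2e6, 400_001)
dw = w[1] - w[0]
S_yy = (eta / 2) / (1 + (w * RC) ** 2)

Ey2_num = np.sum(S_yy) * dw / (2 * np.pi)   # Riemann-sum approximation
Ey2_th = eta / (4 * RC)                     # = 500 here
print(Ey2_num, Ey2_th)
```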

7.33 The input X(t) to an ideal bandpass filter having the frequency response characteristic shown in Fig. 7.19 is a white noise process. Determine the total noise power at the output of the filter.

With  S_XX(ω) = η/2

S_YY(ω) = |H(ω)|²S_XX(ω) = (η/2)|H(ω)|²

The total noise power at the output of the filter is

E[Y²(t)] = (1/2π)∫_{−∞}^{∞} S_YY(ω)dω = (η/2)·(1/2π)∫_{−∞}^{∞} |H(ω)|²dω
         = (η/2)·(2W_B/(2π)) = ηB   (7.156)

where B = W_B/(2π) (in Hz).

7.34 The equivalent noise bandwidth of a system is defined as

B_eq = [1/(2π|H(ω)|²_max)] ∫_0^∞ |H(ω)|²dω  Hz   (7.157)

where |H(ω)|²_max = max_ω |H(ω)|².
(a) Determine the equivalent noise bandwidth of the ideal bandpass filter shown in Fig. 7.19.
(b) Determine the equivalent noise bandwidth of the low-pass RC filter shown in Fig. 7.18, and compare the result with the 3-dB bandwidth obtained in Solved Problem 2.12.

(a) For the ideal bandpass filter shown in Fig. 7.19, we have max|H(ω)|² = 1 and

B_eq = (1/2π)∫_0^∞ |H(ω)|²dω = W_B/(2π) = B Hz   (7.158)

(b) For the low-pass RC filter shown in Fig. 7.18 we have

|H(ω)|² = 1/[1 + (ωRC)²]

and  max|H(ω)|² = |H(0)|² = 1

Thus

B_eq = (1/2π)∫_0^∞ dω/[1 + (ωRC)²] = 1/(4RC) Hz   (7.159)

From Solved Problem 2.12 the 3-dB bandwidth of the low-pass RC filter is

B_3dB = (1/2π)ω_3dB = 1/(2πRC) Hz

Thus, B_eq = (π/2)B_3dB ≈ 1.57 B_3dB
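Both results of part (b) can be reproduced by numerical integration. The sketch below (my addition; the time constant is an illustrative value) evaluates Eq. (7.157) for the RC filter and compares it with 1/(4RC) and with π/2 times the 3-dB bandwidth:

```python
import numpy as np

# Numerical check of Eq. (7.159): B_eq for the RC filter is 1/(4 R C),
# about pi/2 ~ 1.57 times the 3-dB bandwidth 1/(2 pi R C).
RC = 1e-3                               # illustrative time constant
w = np.linspace(0.0, 2e7, 2_000_001)
dw = w[1] - w[0]
H2 = 1 / (1 + (w * RC) ** 2)            # |H(w)|^2, max = 1 at w = 0

# Trapezoidal rule (halve the endpoint weights), then apply Eq. (7.157)
B_eq = (np.sum(H2) - 0.5 * (H2[0] + H2[-1])) * dw / (2 * np.pi)
B_3dB = 1 / (2 * np.pi * RC)
print(B_eq, 1 / (4 * RC), B_eq / B_3dB)
```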

7.35 A narrowband white noise X(t) has the power spectrum shown in Fig. 7.20(a). Represent X(t) in terms of quadrature components. Derive the power spectra of X_c(t) and X_s(t), and show that

E[X²(t)] = E[X_c²(t)] = E[X_s²(t)]   (7.160)

From Eq. (7.67)

X(t) = X_c(t) cos ω_c t − X_s(t) sin ω_c t

From Eq. (7.70) and Fig. 7.20(a), we have

S_XcXc(ω) = S_XsXs(ω) = { η   |ω| < W (= 2πB)
                        { 0   otherwise

which is plotted in Fig. 7.20(b). From Fig. 7.20(a)

E[X²(t)] = (1/2π)∫_{−∞}^{∞} S_XX(ω)dω = (1/2π)·2·(η/2)·(2W) = ηW/π = 2ηB

From Fig. 7.20(b), S_XcXc(ω) = S_XsXs(ω), and

E[X_c²(t)] = R_XcXc(0) = (1/2π)∫_{−W}^{W} η dω = η(2W)/(2π) = 2ηB

Hence,

E[X²(t)] = E[X_c²(t)] = E[X_s²(t)] = 2ηB   (7.161)

Supplementary Problems

7.1 Consider a random process X(t) defined by

X(t) = cos Ωt

where Ω is a random variable uniformly distributed over [0, ω0]. Determine whether X(t) is stationary.
[Ans. Nonstationary]
[Hint: Examine specific sample functions of X(t) for different frequencies, say, Ω = π/2, π, and 2π.]

7.2 Consider the random process X(t) defined by

X(t) = A cos ωt

where ω is a constant and A is a random variable uniformly distributed over [0, 1]. Find the autocorrelation and autocovariance of X(t).
[Ans. R_XX(t1, t2) = (1/3) cos ωt1 cos ωt2, C_XX(t1, t2) = (1/12) cos ωt1 cos ωt2]

7.3 Let X(t) be a WSS random process with autocorrelation

R_XX(τ) = Ae^{−a|τ|}

Find the second moment of the random variable Y = X(5) − X(2).
[Ans. 2A(1 − e^{−3a})]

7.4 Let X(t) be a zero-mean WSS random process with autocorrelation R_XX(τ). A random variable Y is formed by integrating X(t):

Y = (1/2T)∫_{−T}^{T} X(t)dt

Find the mean and variance of Y.
[Ans. μ_Y = 0, σ_Y² = (1/T)∫_0^{2T} (1 − τ/(2T))R_XX(τ)dτ]

7.5 A sample function of a random telegraph signal X(t) is shown in Fig. 7.21. This signal makes independent random shifts between two equally likely values, A and 0. The number of shifts per unit time is governed by the Poisson distribution with parameter a.
(a) Find the autocorrelation and the power spectrum of X(t).
(b) Find the rms value of X(t).
[Ans. (a) R_XX(τ) = (A²/4)(1 + e^{−2a|τ|}); S_XX(ω) = (πA²/2)δ(ω) + aA²/(ω² + 4a²)
(b) A/√2]

7.6 Suppose that X(t) is a gaussian process with

μ_X = 2,  R_XX(τ) = 5e^{−0.2|τ|}

Find the probability that X(4) ≤ 1.
[Ans. 0.159]

7.7 The output of a filter is given by

Y(t) = X(t + T) − X(t − T)

where X(t) is a WSS process with power spectrum S_XX(ω) and T is a constant. Find the power spectrum of Y(t).
[Ans. S_YY(ω) = 4 sin² ωT S_XX(ω)]

7.8 Let X̂(t) be the Hilbert transform of a WSS process X(t). Show that

R_XX̂(0) = E[X(t)X̂(t)] = 0

[Hint: Use relation (b) of Solved Problem 7.20 and definition (2.26).]

7.9 When a metallic resistor R is at temperature T, random electron motion produces a noise voltage V(t) at the open-circuited terminals. This voltage V(t) is known as the thermal noise. Its power spectrum S_VV(ω) is practically constant for f ≤ 10¹² Hz and is given by

S_VV(ω) = 2kTR

where k = Boltzmann constant = 1.38 × 10⁻²³ joule per kelvin (J/K)
      T = absolute temperature, kelvins (K)
      R = resistance, ohms (Ω)

Calculate the thermal noise voltage (rms value) across the simple RC circuit shown in Fig. 7.22 with R = 1 kilohm (kΩ), C = 1 microfarad (μF), at T = 27°C.
[Ans. V_rms ≈ 0.2 μV]