Ch_1_Introduction_MM_2021
Chapter 1: Introduction
Q: What is a communication system?
Analog Signal Transmission and Reception
Fig. 1.1 Elements of a Communication System:
Information Source & Input Transducer → Transmitter → Channel → Receiver → Output Transducer → User of Information
(The transmitter sends the transmitted message signal over the channel; the receiver processes the received signal to produce an estimate of the message signal at its output.)
Elements of a Communication System: The Transmitter
The Transmitter:
➢ Converts the electrical signal into a form that is suitable for transmission through the physical channel or transmission medium.
The physical channel may be a pair of wires that carries the electrical signal, an optical fiber that carries the information on a modulated light beam, an underwater ocean channel in which the information is transmitted acoustically, or free space over which the information-bearing signal is radiated by use of antennas.
[Figure: A basic transmitter. The low-frequency information (intelligence) signal and the high-frequency carrier feed the modulator stage; the resulting modulated signal is amplified and coupled to the transmitting medium.]
A sinusoidal carrier can be written as v(t) = Vp sin(2πft + φ). Any one of the last three terms could be varied in accordance with the low-frequency information signal to produce a modulated signal that contains the intelligence. If the amplitude term Vp is varied, it is called amplitude modulation (AM). If the frequency f is varied, it is frequency modulation (FM). Varying the phase angle φ results in phase modulation (PM).
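The three cases above can be illustrated numerically. Below is a minimal NumPy sketch (all parameter values are illustrative, not taken from the text) that builds an AM, an FM, and a PM waveform from the same low-frequency intelligence signal:

```python
import numpy as np

fs = 100_000                      # sampling rate in Hz (illustrative)
t = np.arange(0, 0.01, 1 / fs)    # 10 ms of signal
fc, fm = 10_000.0, 1_000.0        # carrier and message frequencies (Hz)
Vp = 1.0                          # carrier peak amplitude

m = np.cos(2 * np.pi * fm * t)    # low-frequency intelligence signal

# AM: vary the amplitude term Vp in step with the message
am = Vp * (1 + 0.5 * m) * np.sin(2 * np.pi * fc * t)

# FM: vary the instantaneous frequency; integrating the message gives the phase
kf = 2_000.0                      # frequency-deviation constant (Hz per unit of m)
phase = 2 * np.pi * kf * np.cumsum(m) / fs
fm_sig = Vp * np.sin(2 * np.pi * fc * t + phase)

# PM: vary the phase angle phi directly with the message
kp = 0.5                          # phase-deviation constant (rad per unit of m)
pm_sig = Vp * np.sin(2 * np.pi * fc * t + kp * m)
```

Note that AM changes the envelope of the carrier (its peak grows to Vp(1 + 0.5) here), while FM and PM keep the amplitude constant and put the intelligence entirely in the angle.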
Some commonly used yet confusing jargon
What is Demodulation?
At the receiving end of the system, the original baseband message signal is restored by performing a process called demodulation. Demodulation can be viewed as the reverse of the modulation process.
The Modulation Process
The power content of the message signal m(t):

P_m = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} m²(t) dt

The carrier:

c(t) = A_c cos(2πf_c t + φ_c)

A_c: carrier amplitude, f_c: carrier frequency, φ_c: carrier phase.
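The time-average power definition above is easy to check numerically. For a sinusoidal carrier of amplitude A_c the power works out to A_c²/2; the sketch below (illustrative parameter values) approximates the limit with a sample mean over many whole carrier cycles:

```python
import numpy as np

fs = 1_000_000                    # sampling rate (Hz), illustrative
T = 0.01                          # observation window: 100 carrier cycles
t = np.arange(0, T, 1 / fs)

Ac, fc = 2.0, 10_000.0            # illustrative carrier amplitude and frequency
c = Ac * np.cos(2 * np.pi * fc * t)

# Discrete approximation of P = lim (1/T) ∫ c²(t) dt
P = np.mean(c ** 2)
# For a sinusoid averaged over whole cycles this equals Ac²/2 = 2.0
```

Averaging over an integer number of cycles is what makes the estimate exact here; a fractional final cycle would add a small bias.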
Review of Fourier Transform
➢ The Fourier integral transform pair can be given as:

G(f) = ∫_{−∞}^{∞} g(t) e^{−j2πft} dt

g(t) = ∫_{−∞}^{∞} G(f) e^{j2πft} df

where f = frequency measured in Hz. This pair can be used to describe the time-frequency relationship for non-periodic signals. The relationship between the time and frequency domains is indicated by the double arrow, given as:

g(t) ⇌ G(f)

➢ G(f) is specified by a magnitude characteristic and a phase characteristic:

G(f) = |G(f)| e^{jθ(f)}
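The forward transform above can be approximated directly as a Riemann sum. The sketch below (an illustrative check, not from the text) evaluates G(f) for a rectangular pulse g(t) = 1 for |t| ≤ 1/2, whose known transform is sinc(f):

```python
import numpy as np

# Rectangular pulse of unit width and height, sampled finely in time
dt = 1e-3
t = np.arange(-5, 5, dt)
g = (np.abs(t) <= 0.5).astype(float)

def ft(g, t, f):
    # Riemann-sum approximation of G(f) = ∫ g(t) e^{-j2πft} dt
    return np.sum(g * np.exp(-2j * np.pi * f * t)) * (t[1] - t[0])

f = 2.0
G = ft(g, t, f)
expected = np.sinc(f)   # NumPy's sinc(x) is sin(πx)/(πx)
```

At f = 2 both the numerical and exact values are essentially zero (a null of the sinc); at f = 0 the integral is just the pulse area, 1.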
Review of Fourier Transform
Properties of the Fourier transform
[Table of Fourier transform properties: Property | Mathematical Description]

Table of Fourier Transform Pairs
[Table of Fourier transform pairs]
RANDOM PROCESSES
Random Processes
There are many phenomena in nature that cannot be modeled deterministically; they can only be modeled in statistical terms. The notion of a random process is an extension of the random variable. Example: the temperature x of a place at a particular time of every day.
Random Processes
X(t, s), −T ≤ t ≤ T
For a fixed sample point s_j, the graph of the function X(t, s_j) versus time t is called a realization or sample function of the random process. These sample functions are denoted as:

x_j(t) = X(t, s_j), j = 1, 2, …, n
Random Processes
For a fixed time t_k, the set of numbers

{x_1(t_k), x_2(t_k), …, x_n(t_k)} = {X(t_k, s_1), X(t_k, s_2), …, X(t_k, s_n)}

constitutes a random variable (RV). This gives an indexed ensemble (family) of RVs {X(t, s)}, called a random process.
Random Processes
Let us consider a random process X(t) at different time instants t_1 > t_2 > … > t_n, where n is a positive integer. In general, the samples X_{t_i} ≡ X(t_i), i = 1, 2, …, n are n random variables statistically characterized by their joint probability density function (PDF), denoted p(x_{t_1}, x_{t_2}, …, x_{t_n}) for any n.
Stationary Random Processes
Let us consider that we have n samples of the random process X(t) at t = t_i, i = 1, 2, …, n, and another set of n samples displaced in time from the first set by τ, that is, X_{t_i+τ} ≡ X(t_i + τ), i = 1, 2, …, n.

Definition:
The mean or expectation of the random process X(t) is a deterministic function of time, denoted by m_x(t), that at each time instant t_0 equals the mean of the random variable X(t_0). That is, m_x(t) = E[X(t)] for all t. Since, at any t_0, the random variable X(t_0) is well defined with a probability density function f_{X(t_0)}(x), then

E[X(t_0)] = m_x(t_0) = ∫_{−∞}^{∞} x f_{X(t_0)}(x) dx
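The definition above can be made concrete by simulating an ensemble. The sketch below uses a hypothetical process X(t, s_j) = A_j cos(2πt) with random amplitude A_j (an example chosen for illustration, not from the text) and estimates m_x(t) by averaging across the ensemble at each fixed t:

```python
import numpy as np

rng = np.random.default_rng(0)
n_realizations = 10_000
t = np.linspace(0, 1, 101)

# Ensemble of sample functions X(t, s_j) = A_j cos(2πt), with A_j ~ Uniform(0, 2)
A = rng.uniform(0, 2, size=(n_realizations, 1))
ensemble = A * np.cos(2 * np.pi * t)          # one row per sample function

# m_x(t) estimated by averaging across the ensemble at each fixed time instant
m_hat = ensemble.mean(axis=0)

# Theoretical mean: m_x(t) = E[A] cos(2πt) = 1 · cos(2πt), a function of time
m_theory = np.cos(2 * np.pi * t)
```

Note that here m_x(t) genuinely varies with t, which is exactly why the mean of a random process is a deterministic *function of time* rather than a single number.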
Autocorrelation function
The autocorrelation function completely describes the power spectral density and the power content of a large class of random processes.

Definition:
The autocorrelation function of the random process X(t), denoted by R_X(t_1, t_2), is defined by:

R_X(t_1, t_2) = E[X(t_1) X(t_2)]
Autocorrelation Function
Let a random process X(t) be sampled at t = t_i. Then X(t_i) is a random variable with PDF p(x_{t_i}). For a stationary process, p(x_{t_i+τ}) = p(x_{t_i}) for all τ. Hence the PDF is independent of time.

R_X(t_1, t_2) = E[X(t_1) X(t_2)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x_{t_1} x_{t_2} p(x_{t_1}, x_{t_2}) dx_{t_1} dx_{t_2}
Autocorrelation Function
For a stationary process X(t), the joint PDF of the pair (X_{t_1}, X_{t_2}) is identical to the joint PDF of the pair (X_{t_1+τ}, X_{t_2+τ}) for any arbitrary τ. Hence the autocorrelation function of X(t) depends only on the time difference t_1 − t_2 = τ. The autocorrelation function is an even function of τ: with t_1′ = t_1 − τ,

R_X(−τ) = E[X(t_1 − τ) X(t_1)] = E[X(t_1′) X(t_1′ + τ)] = R_X(τ)

The more rapidly the random process X(t) changes with time, the more rapidly will the autocorrelation function R_X(τ) decrease from its maximum R_X(0) as τ increases.
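As a numerical illustration (a textbook-style hypothetical process, not one defined in the text), consider X(t) = cos(2πf₀t + Θ) with Θ uniform on [0, 2π), which is stationary with R_X(τ) = (1/2)cos(2πf₀τ). The sketch estimates R_X(τ) by an ensemble average and checks that it does not depend on the absolute time t₁:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20_000                              # number of sample functions
fs = 100.0
t = np.arange(0, 1, 1 / fs)             # 100 time samples per realization

# Stationary process: X(t) = cos(2π f0 t + Θ), Θ ~ Uniform(0, 2π)
f0 = 5.0
theta = rng.uniform(0, 2 * np.pi, size=(n, 1))
X = np.cos(2 * np.pi * f0 * t + theta)

# Ensemble estimate of R_X(τ) = E[X(t1) X(t1 + τ)], first at t1 = 0
tau_idx = 7                             # τ = 7/fs seconds
R_hat = np.mean(X[:, 0] * X[:, tau_idx])

# Same τ but a different absolute time t1: stationarity gives the same value
R_hat2 = np.mean(X[:, 3] * X[:, 3 + tau_idx])

R_theory = 0.5 * np.cos(2 * np.pi * f0 * tau_idx / fs)
```

Both estimates agree with the theoretical value, confirming that only the lag τ matters for a stationary process.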
Crosscorrelation Function
The crosscorrelation function of two random processes X(t) and Y(t) is defined by the joint moment:

R_XY(t_1, t_2) = E[X(t_1) Y(t_2)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x_{t_1} y_{t_2} p(x_{t_1}, y_{t_2}) dx_{t_1} dy_{t_2}

The random processes X(t) and Y(t) are statistically independent if and only if:

p(x_{t_1}, x_{t_2}, …, x_{t_n}, y_{t_1′}, y_{t_2′}, …, y_{t_m′}) = p(x_{t_1}, x_{t_2}, …, x_{t_n}) p(y_{t_1′}, y_{t_2′}, …, y_{t_m′})

for all choices of t_i, t_i′ and for all positive integers n and m.

The random processes X(t) and Y(t) are said to be uncorrelated if:

R_XY(t_1, t_2) = E[X(t_1)] E[Y(t_2)]
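The uncorrelated condition is straightforward to verify by simulation. Below is a minimal sketch (hypothetical distributions chosen for illustration) with two independent samples X(t₁) and Y(t₂); independence implies they are uncorrelated, so R_XY should match the product of the means:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50_000

# Independent samples of two hypothetical processes at fixed times t1, t2
X_t1 = rng.normal(1.0, 1.0, n)    # X(t1): mean 1, std 1
Y_t2 = rng.normal(2.0, 1.0, n)    # Y(t2): mean 2, std 1, independent of X

R_XY = np.mean(X_t1 * Y_t2)       # crosscorrelation estimate E[X(t1) Y(t2)]
product_of_means = X_t1.mean() * Y_t2.mean()
# For uncorrelated processes, R_XY ≈ E[X(t1)] E[Y(t2)] = 1 · 2 = 2
```

The converse does not hold in general: uncorrelated processes need not be independent, since the factorization of the joint PDF is a much stronger condition than the factorization of this single joint moment.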
The Wiener-Khintchine Theorem
For a stationary random process, we use a very useful theorem that relates the power spectrum of a random process to its autocorrelation function:

S_X(f) = ℱ[R_X(τ)]
Power Spectral Density
Definition #1:
The power spectral density (PSD) or power spectrum of a stationary process X(t) is defined as:

S_XX(f) = S_X(f) = ∫_{−∞}^{∞} R_XX(τ) e^{−j2πfτ} dτ
Power Spectral Density
In signal theory, spectra are associated with Fourier transforms. For deterministic signals, they are used to represent a function as a superposition of exponentials. For random signals, the notion of the spectrum involves transforms of averages: it is thus essentially deterministic.

Definition #2:
Thus, the power spectral density S_XX(f) and the autocorrelation function R_XX(τ) of a stationary process X(t) form a Fourier transform pair:

S_XX(f) = ∫_{−∞}^{∞} R_XX(τ) e^{−j2πfτ} dτ

R_XX(τ) = ∫_{−∞}^{∞} S_XX(f) e^{j2πfτ} df
Power Spectral Density (PSD)
Properties …
White Noise
References
✓ John G. Proakis and Masoud Salehi, Communication Systems
Engineering. 2nd Edition, Pearson Education, 2008.