
Principles of Communications

Dr. Madhubanti Maitra


Electrical Engineering Department,
Jadavpur University,
Kolkata.

Chapter 1: Introduction

Q: What is a communication system?

A: An electronic communication system is concerned with solving two types of problems:

(1) Production and transmission of electrical energy.

(2) Transmission or processing of information.

Conclusion: Communication systems are designed to transmit information.
Analog Signal Transmission and Reception

The purpose of a communication system is to transmit information-bearing waveforms through a communication channel joining the transmitter to the receiver.

A large number of such information sources are analog sources. Speech, image and video are examples of analog sources of information.

The information-bearing waveforms presented to the receiver are unknown until received, and these waveforms are often characterized by their bandwidth, center frequency, waveform power and energy, the effect of noise, etc.
Fundamental Diagram of a Communication System

Fig. 1.1 Elements of a Communication System:

Information Source & Input Transducer --(Message Signal)--> Transmitter --(Transmitted Signal)--> Channel --(Received Signal)--> Receiver --(Output Signal: estimate of the message signal)--> Output Transducer --> User of Information
Elements of a Communication System: The Transmitter

The heart of a Communication System consists of

The Transmitter:

➢ Converts the electrical signal into a form that is suitable for transmission through the physical channel or transmission medium.

➢ Transmits signals in the frequency range specified by the telecom regulatory authority of each country.

➢ Translates the outgoing information signal into the appropriate frequency range, matching the frequency allocated to the transmitter.

➢ Thus signals transmitted by multiple radio stations do not interfere with one another.

Example: Telephone communication systems, where the electrical speech signals from many users are transmitted over the same wire.
Radio Frequency Spectra

Frequency        Designation                Abbreviation
30-300 Hz        Extremely Low Frequency    ELF
300-3000 Hz      Voice Frequency            VF
3-30 kHz         Very Low Frequency         VLF
30-300 kHz       Low Frequency              LF
300 kHz-3 MHz    Medium Frequency           MF
3-30 MHz         High Frequency             HF
30-300 MHz       Very High Frequency        VHF
300 MHz-3 GHz    Ultra High Frequency       UHF
3-30 GHz         Super High Frequency       SHF
30-300 GHz       Extremely High Frequency   EHF

Note: Communication systems are often categorized by the frequency of the carrier.
Elements of a Communication System: The Channel

The communication channel is the physical medium that is used to send the signal from the transmitter to the receiver.

In wireless transmission, the channel is usually the atmosphere (free space). On the other hand, telephone channels usually employ a variety of physical media, including wirelines, fiber optic cables and wireless (microwave radio).

The physical channel may be a pair of wires that carry the electrical signal, or an optical fiber that carries the information on a modulated light beam, or an underwater ocean channel in which the information is transmitted acoustically, or free space over which the information-bearing signal is radiated by use of antennas.

Whatever the physical medium for signal transmission, the essential feature is that the transmitted signal is corrupted in a random manner by a variety of mechanisms.
Introduction to the Channel & Noise

One common problem in signal transmission through any channel is additive noise.

In general, additive noise is generated internally by components, such as resistors and solid-state devices, used to implement the communication system.

The most common form of signal degradation comes in the form of additive noise generated at the front end of the receiver, where signal amplification is performed.

This noise is often called thermal noise. In wireless transmission, additional additive disturbances are man-made noise and atmospheric noise picked up by a receiving antenna.
Channel & Noise: Examples

Automobile ignition noise (man-made)

Electrical lightning discharges from thunderstorms (atmospheric noise)

Interference from other users of the channel……

The additive signal distortions are usually characterized as random phenomena and described in statistical terms. The effect of these distortions must be considered in the design of communication systems.
Elements of Communication Systems: The Receiver

The function of the receiver is to recover the message signal contained in the received signal.

If the message signal is transmitted by carrier modulation, the receiver performs demodulation to extract the message from the sinusoidal carrier.

Since the signal demodulation is performed in the presence of additive noise and possibly other signal distortions, the demodulated message signal is generally degraded to some extent by the presence of these distortions in the received signal.

Besides performing the primary function of signal demodulation, the receiver also performs a number of peripheral functions, including signal filtering and noise suppression.
Fig. 1.2 A Detailed Block Diagram:

Transmitter: Low-frequency information (intelligence) signal + High-frequency carrier --> Modulated Stage --> Amplifier --> Modulated Signal --> Transmitting Medium

Receiver: Amplifier --> Demodulator (Detector) --> Intelligence Signal --> Amplifier --> Output Transducer

So…What is Modulation and Demodulation?

Modulation is the process of putting information onto a high-frequency carrier for transmission.

In essence, the transmission takes place at the high frequency (the carrier), which has been modified to “carry” the lower-frequency information.

The low-frequency information is often called the “intelligence signal” or simply “intelligence”.

It follows that once this information is received, the intelligence must be removed from the high-frequency carrier—a process known as demodulation.
Basics of Analog Modulation (Contd.)

Modulation enables propagation of the low-frequency signal with a high-frequency carrier.

There are three basic methods of putting low-frequency information onto a higher frequency. Refer to equation (1.1), which represents a sine wave that we assume to be the high-frequency carrier:

ν = Vp sin(ωt + φ)    (1.1)

ν is the instantaneous value
Vp is the peak value
ω = 2πf is the angular frequency
φ is the phase angle

Any one of the last three terms could be varied in accordance with the low-frequency information signal to produce a modulated signal that contains the intelligence. If the amplitude term Vp is varied, it is called amplitude modulation (AM). If the frequency is varied, it is frequency modulation (FM). Varying the phase angle φ results in phase modulation (PM).
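The three options above can be sketched numerically. A minimal NumPy sketch follows; the carrier/message frequencies and the deviation constants kf and kp are illustrative assumptions, not values from the text:

```python
import numpy as np

fs = 100_000                        # sample rate in Hz (illustrative)
t = np.arange(0, 0.01, 1 / fs)      # 10 ms of signal
fc, fm = 10_000, 500                # carrier and message frequencies in Hz (assumed)
Vp = 1.0                            # carrier peak value
m = np.sin(2 * np.pi * fm * t)      # low-frequency intelligence m(t)

# AM: the amplitude term Vp is varied with m(t)
am = Vp * (1 + 0.5 * m) * np.sin(2 * np.pi * fc * t)

# FM: the instantaneous frequency is varied with m(t)
kf = 2_000                          # frequency deviation in Hz per unit of m (assumed)
phase = 2 * np.pi * np.cumsum(fc + kf * m) / fs
fm_sig = Vp * np.sin(phase)

# PM: the phase angle phi is varied with m(t)
kp = 0.5                            # phase deviation in rad per unit of m (assumed)
pm = Vp * np.sin(2 * np.pi * fc * t + kp * m)
```

Note that AM changes the envelope (peaks approach Vp·(1 + 0.5) = 1.5 here), while FM and PM keep a constant envelope of Vp.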
Some Commonly Used yet Confusing Jargon

The higher-frequency carrier signal is often referred to simply as the carrier.

If the frequency is high enough, in some applications it is also termed a Radio Frequency (RF) signal, as it can be transmitted through free space as a radio wave.

The low-frequency intelligence or message signal is also termed the “Modulating Signal”, “Information Signal” or “Modulating Wave”.
Analog Signal Transmission and Reception

There are two basic modes of communication:

➢ Broadcasting: Involves the use of a single powerful transmitter and numerous receivers that are relatively inexpensive to build.

➢ Point-to-Point: Takes place over a link between a single transmitter and a receiver.
Analog Signal Transmission and Reception

Analog signal transmission takes two forms:

➢ Baseband communication: does not use modulation.

➢ Carrier communication: makes use of modulation.

Baseband refers to the band of frequencies representing the original signal as delivered by a source of information. The baseband in telephony is the audio band, with the range 0 – 3.5 kHz, and in television it is the video band, with the range 0 – 4.3 MHz.
Analog Signal Transmission and Reception: Features

➢ In baseband communication, signals are transmitted without any shift in the range of frequencies of the signal. However, baseband signals produced by various information sources are not always suitable for direct transmission.

➢ Modulation causes a shift in the range of frequencies. This is called carrier communication.

➢ A carrier is generally denoted by c(t), and one of its parameters, i.e. amplitude, frequency, or phase, is varied in proportion to the baseband signal m(t).
Analog Signal Transmission and Reception

Analog modulation techniques:

➢ Amplitude Modulation (AM): the amplitude of c(t) is varied with m(t).

➢ Angle Modulation: the instantaneous phase or frequency of c(t) is varied with m(t). When the carrier frequency is modulated, it is called Frequency Modulation (FM). When the carrier phase is modulated, it is called Phase Modulation (PM).
Need for Modulation: Revisited

To translate the low-pass signal in frequency to the passband of the channel, so that the spectrum of the transmitted bandpass signal matches the passband characteristics of the channel.

To accommodate simultaneous transmission of signals from several message sources using frequency-division multiplexing (FDM).

To expand the bandwidth of the transmitted signal in order to increase its noise immunity in transmission over a noisy channel.

What is Demodulation?

At the receiving end of the system, the original baseband signal is restored by performing a process called demodulation. Demodulation can be viewed as the reverse of the modulation process.
The Modulation Process

For all modulation processes, the analog signal m(t) is considered a low-pass signal of bandwidth W, i.e. M(f) = 0 for |f| > W. The signal is assumed to be a power signal with power Pm:

Pm = lim(T→∞) (1/T) ∫ from −T/2 to T/2 of m²(t) dt

This message signal is carried through the communication channel by impressing it on a carrier signal c(t):

c(t) = Ac cos(2πfc t + φc)

Ac: carrier amplitude, fc: carrier frequency, φc: carrier phase.
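The limit defining Pm can be checked numerically on a finite averaging window. A short sketch, where the 100 Hz cosine test message is an assumed example whose power is known to be 1/2:

```python
import numpy as np

# Approximate Pm = lim(T→∞) (1/T) ∫ m²(t) dt over [−T/2, T/2]
# using a finite window T and a Riemann sum.
fs = 100_000                         # samples per second
T = 1.0                              # finite (not infinite) window length in seconds
t = np.arange(-T / 2, T / 2, 1 / fs)
m = np.cos(2 * np.pi * 100 * t)      # test message; true power is 1/2
Pm = np.mean(m ** 2)                 # equals (1/T) * sum of m²(t) * dt
```

Averaging over a whole number of periods, the estimate matches the analytical power 1/2 almost exactly.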
Review of Fourier Transform

➢ The Fourier integral transform pair can be given as:

G(f) = ∫ from −∞ to ∞ of g(t) e^(−j2πft) dt

g(t) = ∫ from −∞ to ∞ of G(f) e^(j2πft) df

where f = frequency measured in Hz. This pair can be used to describe the time-frequency relationship for non-periodic signals. The relationship between the time and frequency domains is indicated by the double arrow, given as:

g(t) ⇌ G(f)

➢ G(f) is specified by a magnitude characteristic and a phase characteristic:

G(f) = |G(f)| e^(jθ(f))
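The forward transform can be evaluated numerically as a check. A sketch using the standard unit-width rectangular pulse, whose transform is sinc(f) = sin(πf)/(πf):

```python
import numpy as np

# Approximate G(f) = ∫ g(t) e^(−j2πft) dt by a Riemann sum for a
# unit rectangular pulse g(t) = 1 for |t| <= 1/2; analytically G(f) = sinc(f).
dt = 1e-4
t = np.arange(-5.0, 5.0, dt)
g = np.where(np.abs(t) <= 0.5, 1.0, 0.0)

def G(f):
    # Riemann-sum approximation of the Fourier integral
    return (np.sum(g * np.exp(-2j * np.pi * f * t)) * dt).real

# G(0) is the pulse area (1), and G(1) sits at the first null of the sinc (0)
```

Comparing G(f) against np.sinc(f) at a few frequencies confirms the pair numerically.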
Review of Fourier Transform

Properties of the Fourier transform
[Table: Property | Mathematical Description]

Table of Fourier Transform Pairs
[Table: Time Domain | Frequency Domain]
RANDOM PROCESSES

Random Processes

There are several phenomena in nature which cannot be modeled deterministically. They can only be modeled in statistical terms. The notion of a random process is an extension of the random variable. Example: the temperature (say, x) of a place at a particular time of every day.

But the temperature is also a function of time. At 5 P.M. the temperature may have an entirely different distribution from that of the temperature at noon. Thus the random variable x is also a function of time. A random variable that is a function of time is called a random process or a stochastic process.

To specify a random variable x, we repeat the experiment a large number of times and from the outcomes determine the probability density function, px(x). Similarly, to specify the random process x(t), we do the same thing for each value of t.
Random Process

➢ Hence, in the process, we get a set of waveforms which are functions of time.

➢ The collection of all possible waveforms (corresponding to the sample space) is called an ensemble of time functions, or a random process.

➢ A waveform in this collection is a sample function (rather than a sample point).

The set (ensemble) of all possible waveforms (or the family of waveforms) of a random process is denoted as X(t,S), where t is the time index and S represents the set or sample space of all possible sample functions. A single waveform in this set is denoted x(t,s). Usually s or S is dropped from the notation.
Sample Space to Random Process

A random experiment can be described in terms of a sample space S. The totality of sample points corresponding to the aggregate of all possible experimental outcomes is called the sample space.

The sample-function amplitudes at some instant t = t1 are the values taken by the r.v. x(t1) in various trials.
Random Variable & Random Process

A random variable x is a rule for assigning to every outcome s of an experiment a number x(s).

A random process x(t) is a rule for assigning to every s a time function x(t, s).

Thus a random process is a family of time functions depending on the parameter s, or equivalently a function of t and s.
Random Processes

Fig.: An ensemble of sample functions.


Random Process

A probability distribution can be defined over a class of sets in the sample space S. Consequently, the probability of various events can be determined.

A random experiment, thus, can be specified by the outcomes s from some sample space S, by the events defined on the sample space S and by the probabilities of the occurrences of these events.

To each sample point s, a function of time can be assigned in accordance with the rule:

X(t, s), −T ≤ t ≤ T

where 2T is the total observation interval.
Random Processes

For a fixed sample point sj, the graph of the function X(t, sj) versus time t is called a realization or sample function of the random process. These sample functions are denoted as:

xj(t) = X(t, sj), j = 1, 2, …, n
Random Processes

For a fixed time tk inside the observation interval, the set of numbers

{x1(tk), x2(tk), …, xn(tk)} = {X(tk, s1), X(tk, s2), …, X(tk, sn)}

constitutes a random variable. This gives an indexed ensemble (family) of RVs {X(t,s)}, called a random process.
Random Processes

Let us consider a random process X(t) at different time instants t1 > t2 > … > tn, where n is a positive integer. In general, the samples Xti ≡ X(ti), i = 1, 2, …, n, are n random variables statistically characterized by their joint probability density function (PDF), denoted p(xt1, xt2, …, xtn) for any n.

Hence, formally, a random process X(t) can be defined as an ensemble of time functions together with a probability rule that assigns a probability to any meaningful event associated with an observation of one of the sample functions of the random process.
Stationary Random Processes

Let us consider that we have n samples of the random process X(t) at t = ti, i = 1, 2, …, n, and another set of n samples displaced in time from the first set by τ, i.e. Xti+τ ≡ X(ti+τ), i = 1, 2, …, n.

If the joint PDFs of the two sets of random variables are identical, i.e. p(xt1, xt2, …, xtn) = p(xt1+τ, xt2+τ, …, xtn+τ) for all τ and n, the random process is called stationary in the strict sense.

The statistical properties of a stationary random process are invariant to a translation of the time axis.
Mean or Expectation of a Random Process

Definition:
The mean or expectation of the random process X(t) is a deterministic function of time, denoted mx(t), that at each time instant t0 equals the mean of the random variable X(t0). That is, mx(t) = E[X(t)] for all t. Since, at any t0, the random variable X(t0) is well defined with a probability density function fX(t0)(x), then

E[X(t0)] = mx(t0) = ∫ from −∞ to ∞ of x fX(t0)(x) dx
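The ensemble average can be simulated directly. A small sketch with an assumed toy process X(t) = A·cos(2πt), A ~ Uniform(0, 2), for which mx(t) = E[A]·cos(2πt) = cos(2πt):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy ensemble: X(t) = A*cos(2*pi*t) with random amplitude A ~ Uniform(0, 2).
# Then mx(t) = E[A]*cos(2*pi*t) = 1*cos(2*pi*t).
t = np.linspace(0.0, 1.0, 101)
A = rng.uniform(0.0, 2.0, size=(100_000, 1))  # one amplitude per sample function
X = A * np.cos(2 * np.pi * t)                 # each row is one realization x(t, s)
mx = X.mean(axis=0)                           # ensemble mean at each instant t
```

Averaging over many sample functions recovers the deterministic function mx(t) to within sampling error.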
Autocorrelation Function

The autocorrelation function completely describes the power spectral density and the power content of a large class of random processes.

Definition:
The autocorrelation function of the random process X(t), denoted Rx(t1, t2), is defined by:

Rx(t1, t2) = E[X(t1) X(t2)]
Autocorrelation Function

Let a random process X(t) be sampled at t = ti. Then X(ti) is a random variable with PDF p(xti). For a stationary process, p(xti+τ) = p(xti) for all τ. Hence the PDF is independent of time.

Let us consider two random variables Xti = X(ti), i = 1, 2, corresponding to samples of X(t) taken at t = t1 and t = t2. The autocorrelation function of the random process X(t) is measured by the expectation of the product of the two random variables Xt1 and Xt2. Mathematically speaking,

RXX(t1, t2) = RX(t1, t2) = E(Xt1 Xt2) = ∫∫ from −∞ to ∞ of xt1 xt2 p(xt1, xt2) dxt1 dxt2
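The expectation above can be estimated over an ensemble. A sketch with an assumed toy process, the random-phase sinusoid X(t) = cos(2πt + Θ), Θ ~ Uniform(0, 2π), whose autocorrelation is known to be RXX(t1, t2) = ½·cos(2π(t1 − t2)):

```python
import numpy as np

rng = np.random.default_rng(1)

# Random-phase sinusoid: X(t) = cos(2*pi*t + theta), theta ~ Uniform(0, 2*pi).
# Known result: RXX(t1, t2) = 0.5*cos(2*pi*(t1 - t2)); it depends only on t1 - t2.
theta = rng.uniform(0.0, 2.0 * np.pi, size=200_000)
t1, t2 = 0.3, 0.1
x1 = np.cos(2 * np.pi * t1 + theta)           # X(t1) across the ensemble
x2 = np.cos(2 * np.pi * t2 + theta)           # X(t2) across the ensemble
R_est = np.mean(x1 * x2)                      # estimate of E[X(t1) X(t2)]
R_true = 0.5 * np.cos(2 * np.pi * (t1 - t2))
```

That the estimate tracks 0.5·cos(2π(t1 − t2)) for any t1, t2 with the same difference previews the stationary case discussed next.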
Autocorrelation Function

For a stationary process X(t), the joint PDF of the pair (Xt1, Xt2) is identical to the joint PDF of the pair (Xt1+τ, Xt2+τ) for any arbitrary τ. Hence the autocorrelation function of X(t) depends on the time difference t1 − t2 = τ.

For a stationary, real-valued random process, the autocorrelation function is:

RXX(t1, t2) = RXX(t1 − t2) = RXX(τ) = E(Xt1+τ Xt1)

On the other hand:

RXX(−τ) = E(Xt1−τ Xt1) = E(Xt1′ Xt1′+τ) = RXX(τ)

RXX(0) = E(Xt1²) = average power of the random process
Autocorrelation Function

The physical significance of the autocorrelation function is that it provides a means of describing the interdependence of two random variables obtained by observing a random process X(t) at times τ seconds apart.

The more rapidly the random process X(t) changes with time, the more rapidly the autocorrelation function RX(τ) will decrease from its maximum RX(0) as τ increases.
Crosscorrelation Function

The crosscorrelation function of two random processes X(t) and Y(t) is defined by the joint moment:

RXY(t1, t2) = E(Xt1 Yt2) = ∫∫ from −∞ to ∞ of xt1 yt2 p(xt1, yt2) dxt1 dyt2

The random processes X(t) and Y(t) are said to be statistically independent iff:

p(xt1, xt2, …, xtn, yt1′, yt2′, …, ytm′) = p(xt1, xt2, …, xtn) p(yt1′, yt2′, …, ytm′)

for all choices of ti, tj′ and for all positive integers n and m.

The random processes X(t) and Y(t) are said to be uncorrelated if:

RXY(t1, t2) = E(Xt1) E(Yt2)
The Wiener-Khintchine Theorem

For stationary random processes, we use a very useful theorem that relates the power spectrum of a random process to its autocorrelation function.

The Wiener-Khintchine Theorem:

For a random process X(t), the power spectral density is the Fourier Transform of the autocorrelation function.

Mathematically, this can be represented as

Sx(f) = F[Rx(τ)]
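The theorem can be checked numerically for a known transform pair. A sketch using the assumed autocorrelation RX(τ) = e^(−a|τ|), whose Fourier transform is 2a/(a² + (2πf)²):

```python
import numpy as np

# Wiener-Khintchine check: Fourier-transform the autocorrelation
# RX(tau) = exp(-a*|tau|) and compare with Sx(f) = 2a / (a^2 + (2*pi*f)^2).
a = 2.0
dtau = 1e-3
tau = np.arange(-20.0, 20.0, dtau)
R = np.exp(-a * np.abs(tau))

def Sx(f):
    # Riemann-sum Fourier transform of RX(tau)
    return (np.sum(R * np.exp(-2j * np.pi * f * tau)) * dtau).real

# At f = 0, Sx(0) is the total area under RX(tau), i.e. 2/a
```

The numerical PSD agrees with the closed form at every tested frequency, which is exactly the content of the theorem for this process.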
Power Spectral Density

Definition #1:
The power spectral density (PSD) or power spectrum of a stationary process X(t) is defined as:

SXX(f) = SX(f) = ∫ from −∞ to ∞ of RXX(τ) e^(−j2πfτ) dτ
Power Spectral Density

In signal theory, spectra are associated with Fourier Transforms. For deterministic signals, they are used to represent a function as a superposition of exponentials. For random signals, the notion of the spectrum involves a transform of averages: it is thus essentially deterministic.

Definition #2:

The power spectrum (or spectral density) of a stationary process X(t) is the Fourier Transform of its autocorrelation R(τ) = E[X(t+τ)X(t)].

Thus, the power spectral density SXX(f) and the autocorrelation function RXX(τ) of a stationary process X(t) form a Fourier transform pair:

SXX(f) = ∫ from −∞ to ∞ of RXX(τ) e^(−j2πfτ) dτ

RXX(τ) = ∫ from −∞ to ∞ of SXX(f) e^(j2πfτ) df
Power Spectral Density (PSD)

Properties:

➢ The zero-frequency value of the PSD of a stationary process equals the total area under the graph of the autocorrelation function:

SXX(0) = ∫ from −∞ to ∞ of RXX(τ) dτ

➢ The average power of a stationary process equals the total area under the graph of the PSD:

E(Xt²) = RXX(0) = ∫ from −∞ to ∞ of SXX(f) df

➢ The PSD of a stationary process is always nonnegative, and the PSD of a real-valued random process is an even function of frequency:

SXX(−f) = SXX(f)
Power Spectral Density

Conclusions:

The power content, or simply the power, of a stationary random process is the sum of the powers at all frequencies in the random process.

In order to find the total power, we have to integrate the power spectral density over all frequencies.

Power in a stationary random process can be found either by integrating its power spectral density (adding all power components) or by substituting τ = 0 in the autocorrelation function of the process.
Noise in Communication Systems

There are many potential sources of noise (external or internal) in a communication system.

The external sources of noise include atmospheric noise, man-made noise, etc.

The internal sources of noise include spontaneous fluctuations of current or voltage in electrical circuits. The two most common examples are shot noise and thermal noise.

Shot noise arises in electronic devices, e.g. diodes and transistors, because of the discrete nature of current flow in these devices.

Thermal noise arises from the random motion of electrons in a conductor.
White Noise

The noise analysis of communication systems is customarily based on an idealized form of noise called white noise. The PSD of white noise is independent of operating frequency.

Fig.: Characteristics of white noise. (a) Power spectral density and (b) autocorrelation function.
White Noise

The PSD of white noise is SN(f) = N0/2, where N0 is in watts per hertz.

The parameter N0 is usually referred to the input stage of the receiver of a communication system. It is given as N0 = kTe, where k = Boltzmann’s constant and Te = the equivalent noise temperature of the receiver.
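Given N0 = kTe, the available noise power in a bandwidth B is N0·B = kTeB. A quick numeric sketch (Te = 290 K and B = 1 MHz are illustrative values, not from the text):

```python
import math

k = 1.380649e-23          # Boltzmann's constant in J/K
Te = 290.0                # equivalent noise temperature in K (illustrative)
B = 1e6                   # receiver bandwidth in Hz (illustrative)

N0 = k * Te               # white-noise parameter N0 in W/Hz
P = N0 * B                # available noise power in bandwidth B, in watts
P_dBm = 10 * math.log10(P / 1e-3)   # roughly -114 dBm for 1 MHz at 290 K
```

This is the familiar back-of-the-envelope noise floor used when budgeting receiver sensitivity.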
Equivalent Noise Temperature

The equivalent noise temperature of a system is defined as the temperature at which a noisy resistor has to be maintained such that, by connecting the resistor to the input of a noiseless version of the system, it produces the same available noise power at the output of the system as that produced by all the sources of noise in the actual system.
White Noise

Strictly speaking, white noise has infinite average power and, as such, it is not physically realizable.

As long as the bandwidth of a noise process at the input of a system is appreciably larger than that of the system itself, the noise process may be modeled as white noise.
References

✓ John G. Proakis and Masoud Salehi, Communication Systems Engineering, 2nd Edition, Pearson Education, 2008.

✓ Simon Haykin, Communication Systems, 4th Edition, Wiley India Edition, 2008.

✓ B. P. Lathi, Modern Digital and Analog Communication Systems, 3rd Edition, Oxford University Press, 2000.

✓ A. Papoulis and S. Unnikrishna Pillai, Probability, Random Variables and Stochastic Processes, 4th Edition, Tata McGraw-Hill, 2002.

✓ Jeffrey S. Beasley and Gary M. Miller, Modern Electronic Communication, 8th Edition, Prentice-Hall of India, 2006.
