Analog and Digital Communication: Modulation, Demodulation and Coding


Analog and Digital Communication

Lec2_chpater01

Asad Abbas, Assistant Professor, Telecom Department, Air University, E-9, Islamabad

Scope of the course ...


General structure of a communication system
[Block diagram] Information source → Transmitter → Channel (noise is added here) → Receiver → User of information: the transmitted signal picks up noise in the channel, and the received signal is processed to recover the information.

Transmitter chain: Formatter → Source encoder → Channel encoder → Modulator
Receiver chain: Demodulator → Channel decoder → Source decoder → Formatter

Random Variable

A random variable X(A) represents a functional relationship between a random event A and a real number. It is usually designated simply by X, with the functional dependence on A considered implicit.

Discrete Random Variable

If X assumes only a finite number of distinct values in any finite interval, it is a discrete random variable, for example the outcome of tossing a die or a coin.

Continuous Random Variable

If X can assume any value within an interval, it is a continuous random variable, for example noise amplitude or temperature.
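As an illustration (not from the original slides), the short Python/NumPy sketch below draws samples from one discrete random variable (a fair die) and one continuous random variable (zero-mean Gaussian noise); the specific values are arbitrary:

import numpy as np

rng = np.random.default_rng(0)

# Discrete RV: outcome of a fair six-sided die (takes only the values 1..6)
die_samples = rng.integers(1, 7, size=10)

# Continuous RV: zero-mean Gaussian noise sample (can take any real value)
noise_samples = rng.normal(loc=0.0, scale=1.0, size=10)

print("die:", die_samples)
print("noise:", noise_samples)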


Distribution

Cumulative Distribution Function (CDF): F_X(x) = P(X ≤ x)
It is the probability that the value taken by the random variable X is less than or equal to the real number x.
Properties
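The property list on the original slide is an image and is not reproduced here; the standard CDF properties, presumably those intended, are:

\[
0 \le F_X(x) \le 1, \qquad
\lim_{x \to -\infty} F_X(x) = 0, \qquad
\lim_{x \to +\infty} F_X(x) = 1,
\]
\[
F_X(x_1) \le F_X(x_2) \ \text{ for } x_1 \le x_2, \qquad
P(x_1 < X \le x_2) = F_X(x_2) - F_X(x_1).
\]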


Distribution (contd..)

Probability Density Function (PDF): p_X(x) = dF_X(x)/dx
It is the rate of change of the CDF with respect to the random variable, so the CDF is recovered by integration:

F_X(x) = ∫_{-∞}^{x} p_X(u) du

Properties
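As above, the slide's property list is an image; the standard PDF properties, presumably those intended, are:

\[
p_X(x) \ge 0, \qquad
\int_{-\infty}^{\infty} p_X(x)\,dx = 1, \qquad
P(x_1 < X \le x_2) = \int_{x_1}^{x_2} p_X(x)\,dx .
\]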


Ensemble Averages of a Random Variable


Mean of a discrete RV:

m_X = E{X} = Σ_i x_i p_X(x_i)

The mean is the first moment of the PDF of a random variable; the n-th moment is E{X^n}, and the second moment is E{X²}. The second central moment (the variance) is given by

σ_X² = E{(X − m_X)²} = E{X²} − m_X²
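As a quick numeric illustration (not from the slides), the mean, second moment, and variance of a fair die computed with NumPy:

import numpy as np

values = np.arange(1, 7)           # possible outcomes of a fair die
probs = np.full(6, 1 / 6)          # uniform probability mass function

mean = np.sum(values * probs)              # first moment, E{X}
second_moment = np.sum(values**2 * probs)  # E{X^2}
variance = second_moment - mean**2         # second central moment

print(mean, second_moment, variance)       # 3.5, ~15.17, ~2.92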


Ensemble Averages (contd..)


Random Process

A random variable that is a function of both time and the random event is called a random process, X(A, t). A random process is a collection of time functions, or signals, corresponding to the various outcomes of a random experiment.

For each outcome, there exists a deterministic function, which is called a sample function.
[Figure: an ensemble of sample functions (realizations, deterministic functions of time t); a vertical slice at a fixed time is a random variable taking real-number values.]
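To make the ensemble picture concrete, here is an illustrative sketch (an assumed example, not taken from the slides) that generates a few sample functions of a random-phase sinusoid, a classic random process:

import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 1000)    # time axis
f0 = 5.0                           # sinusoid frequency in Hz

# Each outcome A picks a phase; each phase gives one deterministic
# sample function x(t) = cos(2*pi*f0*t + theta).
thetas = rng.uniform(0.0, 2 * np.pi, size=4)
ensemble = np.array([np.cos(2 * np.pi * f0 * t + th) for th in thetas])

# A vertical slice at a fixed time t_k is the random variable X(t_k).
k = 100
print("X(t_k) across the ensemble:", ensemble[:, k])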


Random Process (contd..)

A random process observed at a fixed point in time is a random variable: X(t_k) is the random variable obtained by observing the random process at time t = t_k. The values that X(t_k) can take are x_1(t_k), …, x_N(t_k), one from each sample function. Random processes therefore have all the properties of random variables, such as mean, correlation, and variance.


Statistical Properties of a Random Process

Mean: m_X(t) = E{X(t)}

Autocorrelation of Random Process X(t)

R_X(t_1, t_2) = E{X(t_1) X(t_2)} measures the degree to which two time samples of the same random process are related. It is a function of two time variables, t_1 = t and t_2 = t + τ.
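An illustrative sketch (again using the assumed random-phase sinusoid, not an example from the slides) of estimating the ensemble mean and the autocorrelation by averaging over many realizations:

import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0.0, 1.0, 1000)
f0 = 5.0
thetas = rng.uniform(0.0, 2 * np.pi, size=5000)
ensemble = np.cos(2 * np.pi * f0 * t[None, :] + thetas[:, None])

# Ensemble mean m_X(t): average over realizations at each time instant.
m_x = ensemble.mean(axis=0)          # close to 0 for uniformly random phase

# Autocorrelation R_X(t1, t2) = E{X(t1) X(t2)}, estimated at two chosen times.
i1, i2 = 100, 150
R_x = np.mean(ensemble[:, i1] * ensemble[:, i2])
print(m_x[i1], R_x)                  # R_x depends only on the lag t2 - t1 here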


Random Process Types

Stationary Process (Strictly Stationary Process): if none of the statistical properties of a random process change with time, it is called strictly stationary, that is, its statistics do not depend on the choice of time origin:
Its joint probability density function is invariant to a time shift τ, i.e.
p_X(t_1), p_X(t_2), …, p_X(t_k) = p_X(t_1 + τ), p_X(t_2 + τ), …, p_X(t_k + τ)
Mean: E{X(t)} = m_X(t) = constant
Autocorrelation: R_X(t_1, t_2) depends only on the time difference t_2 − t_1
N-th moment: E{X(t)^n} = constant


Random Process Types

Wide-Sense (or Weakly) Stationary (WSS): in a WSS random process only two statistics, the mean and the autocorrelation, are required not to change with time:

Mean: E{X(t)} = m_X(t) = constant
Autocorrelation: R_X(t_1, t_2) = R_X(t_1 + Δ, t_2 + Δ) = R_X(t_2 − t_1, 0) = R_X(τ), where τ = t_2 − t_1

This means the autocorrelation depends only on the difference between t_1 and t_2; all pairs of samples of X(t) separated by the same lag τ have the same correlation value.


Random Process Types (contd..)

Ergodic process: a random process is ergodic if its ensemble (statistical) averages and its time averages are the same, that is, its mean and autocorrelation can be obtained from a single sample function observed over time.
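The defining equations are images on the original slide; the standard ergodicity conditions, presumably those shown, equate the ensemble averages with the corresponding time averages of a single sample function:

\[
m_X = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} x(t)\,dt, \qquad
R_X(\tau) = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} x(t)\,x(t+\tau)\,dt .
\]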


Random Process


Spectral density

Energy signals: Energy spectral density (ESD)

Power signals: Power spectral density (PSD)

Random processes: Power spectral density (PSD)
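The defining formulas are images on the slide; the standard definitions, presumably those intended, are (X(f) is the Fourier transform of x(t), and X_T(f) that of x(t) truncated to |t| ≤ T/2):

\[
\text{ESD of an energy signal:}\quad \psi_x(f) = |X(f)|^2, \qquad
E_x = \int_{-\infty}^{\infty} \psi_x(f)\,df
\]
\[
\text{PSD of a power signal:}\quad
G_x(f) = \lim_{T \to \infty} \frac{1}{T}\,|X_T(f)|^2, \qquad
P_x = \int_{-\infty}^{\infty} G_x(f)\,df
\]
\[
\text{PSD of a WSS random process:}\quad
G_X(f) = \int_{-\infty}^{\infty} R_X(\tau)\,e^{-j2\pi f \tau}\,d\tau
\quad \text{(Fourier transform of the autocorrelation)}
\]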



Autocorrelation

Autocorrelation of an energy signal

Autocorrelation of a power signal (special case: a periodic signal)

Autocorrelation of a random signal (special case: a WSS process)
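The formulas are images on the slide; the standard definitions for real-valued signals, presumably those intended, are:

\[
\text{Energy signal:}\quad R_x(\tau) = \int_{-\infty}^{\infty} x(t)\,x(t+\tau)\,dt
\]
\[
\text{Power signal:}\quad
R_x(\tau) = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} x(t)\,x(t+\tau)\,dt,
\qquad
\text{periodic signal (period } T_0\text{):}\quad
R_x(\tau) = \frac{1}{T_0} \int_{T_0} x(t)\,x(t+\tau)\,dt
\]
\[
\text{WSS random process:}\quad R_X(\tau) = E\{X(t)\,X(t+\tau)\}
\]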


Properties of an autocorrelation function

For real-valued signals (and WSS in the case of random signals):

1. Autocorrelation and spectral density form a Fourier transform pair.
2. Autocorrelation is symmetric around zero.
3. Its maximum value occurs at the origin.
4. Its value at the origin is equal to the average power (or energy).
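In symbols, for a real-valued WSS process (the energy-signal case is analogous), these properties are conventionally written:

\[
1.\ R_X(\tau) \leftrightarrow G_X(f) \ \text{(Fourier transform pair)} \qquad
2.\ R_X(\tau) = R_X(-\tau)
\]
\[
3.\ |R_X(\tau)| \le R_X(0) \qquad
4.\ R_X(0) = E\{X^2(t)\} \ \text{(average power)}
\]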


Power Spectral Density and Autocorrelation of a Low Rate Signal


Power Spectral Density and Autocorrelation of a High rate Signal


Noise
It is an undesired signal interfering with the desired signal.

External Sources

Atmospheric Noise (up to about 30 MHz): caused by lightning

Solar Noise and Cosmic Noise (the sources are the sun and distant stars; frequency range roughly 8 MHz to 1.43 GHz)

Industrial Noise (frequency range roughly 1–600 MHz): ignition systems, motors, leakage from high-voltage lines, fluorescent tubes


Noise (Contd..)

Internal Sources

Thermal Noise
It is generated in resistive components by the random motion of electrons.
Noise power: P_n = K T B
where K = Boltzmann's constant = 1.38 × 10⁻²³ J/K, T = absolute temperature (K), B = bandwidth (Hz)
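An illustrative calculation (the temperature and bandwidth are assumed values, not from the slide): thermal noise power and its dBm equivalent at room temperature over a 1 MHz bandwidth:

import math

K = 1.38e-23    # Boltzmann's constant, J/K
T = 290.0       # absolute temperature in kelvin (room temperature, assumed)
B = 1e6         # bandwidth in Hz (assumed)

Pn = K * T * B                        # noise power in watts
Pn_dBm = 10 * math.log10(Pn / 1e-3)   # the same power expressed in dBm

print(f"Pn = {Pn:.3e} W = {Pn_dBm:.1f} dBm")   # about 4.0e-15 W, i.e. about -114 dBm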

Shot Noise
Random variations in the arrival of electrons at the output of a semiconductor device.


Noise in communication systems

Thermal noise is described by a zero-mean Gaussian random process, n(t). Its probability density function is given below

Gaussian Probability density function
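The density itself is shown as a figure on the slide; the standard zero-mean Gaussian pdf, presumably the one plotted, is

\[
p(n) = \frac{1}{\sigma\sqrt{2\pi}} \exp\!\left(-\frac{n^2}{2\sigma^2}\right),
\]

where σ² is the noise variance.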



White Noise

The power spectral density G_n(f) of thermal noise is the same from DC up to about 10^12 Hz, so G_n(f) is flat for all frequencies of interest [W/Hz].

Power spectral density

The autocorrelation of white noise is given by the inverse Fourier transform of G_n(f).

Autocorrelation function
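The plots are figures on the slide; the standard white-noise relations, presumably those plotted, are

\[
G_n(f) = \frac{N_0}{2} \ \ [\text{W/Hz}], \qquad
R_n(\tau) = \mathcal{F}^{-1}\{G_n(f)\} = \frac{N_0}{2}\,\delta(\tau),
\]

i.e. a flat PSD and an autocorrelation that is an impulse at τ = 0, so any two distinct time samples of white noise are uncorrelated.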

Signal transmission through linear systems

Input → Linear system (frequency transfer function H(f)) → Output

The input–output relations for deterministic signals and for random signals are given below.
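The relations are images on the slide; for a linear time-invariant system with transfer function H(f), the standard forms (presumably those intended) are

\[
\text{Deterministic signals:}\quad Y(f) = H(f)\,X(f)
\]
\[
\text{Random (WSS) signals:}\quad G_Y(f) = |H(f)|^2\,G_X(f)
\]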


Signal transmission - contd

Ideal distortionless transmission:

All the frequency components of a signal at the input of a linear system are amplified (or attenuated) equally and delayed equally by the system.

The constant delay t_0 is called the group delay.
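The corresponding conditions are images on the slide; the standard distortionless-transmission relations, presumably those shown, are

\[
y(t) = K\,x(t - t_0), \qquad
H(f) = K\,e^{-j2\pi f t_0},
\]

i.e. constant magnitude |H(f)| = K and phase linear in frequency, with t_0 the group delay.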

Frequency and Impulse Responses of Ideal Filter

Ideal low-pass filter: BW = f_u − 0 = f_u

[Figure: ideal low-pass frequency response, flat between −f_u and f_u; the corresponding impulse response is non-causal]

h(t) = 2 f_u sinc[2 f_u (t − t_0)]

Frequency and Impulse responses of Ideal Filter


Ideal high-pass and band-pass filters:

[Figure: ideal high-pass and band-pass frequency responses; the band-pass response is flat between f_l and f_u (and between −f_u and −f_l)]

BW = f_u − f_l

Filter bands: pass band, transition band, stop band


Frequency response of a Realizable Filter

Realizable filters: RC filters

f_cutoff = f_3dB = f_u = f_0 = 1 / (2πRC)

|H(f)| = 1 / √(1 + (f/f_u)²)
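A small numeric sketch (with assumed component values, not from the slide) of the RC low-pass magnitude response:

import numpy as np

R = 1e3          # resistance in ohms (assumed)
C = 159e-9       # capacitance in farads (assumed)
fu = 1 / (2 * np.pi * R * C)     # cut-off (3 dB) frequency, roughly 1 kHz

f = np.array([0.1, 0.5, 1.0, 2.0, 10.0]) * fu
H_mag = 1 / np.sqrt(1 + (f / fu) ** 2)    # |H(f)| of the first-order RC filter
H_dB = 20 * np.log10(H_mag)               # magnitude in dB

print(f"fu = {fu:.1f} Hz")
print(np.round(H_dB, 2))   # at f = fu the response is about -3 dB (|H| ~ 0.707)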


Frequency Response of Realizable Filter

R1 = R2 =1 Ohm

At the cut-off frequency:

|H(f)| in dB = −3 dB, i.e. |H(f)| = 0.707


Bandwidth of a Passband signal

Different definitions of bandwidth:
a) Half-power bandwidth
b) Noise equivalent bandwidth
c) Null-to-null bandwidth
d) Fractional power containment bandwidth
e) Bounded power spectral density (e.g. 50 dB below the peak)
f) Absolute bandwidth

[Figure: a typical power spectral density with bandwidth definitions (a)–(e) marked]



Bandwidth of a signal (contd..)


END

Thank You
