DC - Unit 1 - Final
REGULATION 2017
V SEMESTER ECE A & B
AY-2020-21
BATCH: 2018-22
Handled
By
MS.P. MALATHI,
ASSOC.PROF., ECE
SCOPE OF THE COURSE
Core/ITES companies offering jobs with a strong knowledge in Electronic Circuits:
Samsung, Robert Bosch, Dell, Fujitsu, IBM, Panasonic, Havells, Microsoft, Bajaj Electronics, Intel, Texas Instruments, National Instruments, Microchip, HCL, Wipro

Opportunities in Government organizations:
Indian Engineering Services (IES) Examination, BSNL JTO/TTA, DRDO, ISRO, BEL, ONGC, Power Grid Corporation of India Limited, Hindustan Aeronautics Limited (HAL), National Thermal Power Corporation (NTPC), Doordarshan, All India Radio (AIR), Bharat Heavy Electricals Limited (BHEL)

Higher Studies in India:
GATE
PRE-REQUISITES
EC 8252- II SEMESTER- ELECTRONIC DEVICES
EC 8351- III SEMESTER- ELECTRONIC CIRCUITS I
EC 8392-III SEMESTER- DIGITAL ELECTRONICS
OBJECTIVES
NPTEL LINK:
1. https://nptel.ac.in/courses/117/101/117101051/
2. https://nptel.ac.in/courses/108/102/108102096/
3. https://nptel.ac.in/courses/108/101/108101113/
WEB RESOURCES:
1. https://www.youtube.com/watch?v=Z0Ylnk8zXRo
2. https://www.youtube.com/watch?v=qhjj6WG7Rgc
3. https://www.youtube.com/watch?v=S8X49TQxH-o
4. https://swayam.gov.in/nd1_noc20_ee17/preview
INTERLINKING OF ALL UNITS
[Diagram: the five units interlinked – Information Theory (Unit 1), Waveform Coding, Baseband Transmission and Reception, and Error Control Codes]
Information – Measure of information
• Entropy of a random variable X – a measure of the uncertainty or ambiguity in X:
H(X) = -K * sum over i of p(xi) log p(xi)
where K is a positive constant; with K = 1 and base-2 logarithms, H is measured in bits.
• H = 0: no uncertainty.
• H = 1: maximum uncertainty for a binary source – 1 bit for binary information.
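The entropy formula can be checked with a small numerical sketch (the function name, K = 1, and base-2 logarithms are choices of this sketch, not from the slides):

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum p*log2(p), in bits (K = 1)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair binary source has maximum uncertainty: H = 1 bit.
print(entropy([0.5, 0.5]))   # 1.0
# A deterministic source has no uncertainty: H = 0.
print(entropy([1.0]))        # 0.0
```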
Probability
V SEMESTER- REG 2017- EC 8501- DIGITAL COMMUNICATION- UNIT I-CLASS 1
PRATHYUSHA ENGINEERING COLLEGE
DEPARTMENT OF ECE
Mutual information
• Two discrete random variables X and Y:
I(X;Y) = H(X) - H(X|Y) = H(Y) - H(Y|X)
Some properties: I(X;Y) = I(Y;X) >= 0, with equality to zero iff X and Y are independent.
• Entropy is maximized when the probabilities are equal: for an M-symbol source, H(X) <= log2 M.
• Therefore, equiprobable symbols carry the most information per symbol.
• If the Xi are i.i.d., then H(X1, ..., Xn) = n H(X).
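The identity I(X;Y) = H(X) + H(Y) - H(X,Y) and its non-negativity can be verified numerically; a minimal sketch with an illustrative joint distribution (the numbers are assumptions of this sketch, not from the slides):

```python
import math

def H(probs):
    """Entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Illustrative joint distribution P(X, Y) for two binary variables.
joint = [[0.4, 0.1],
         [0.1, 0.4]]
px = [sum(row) for row in joint]                              # marginal of X
py = [sum(joint[i][j] for i in range(2)) for j in range(2)]   # marginal of Y
hxy = H([p for row in joint for p in row])                    # joint entropy H(X,Y)

# I(X;Y) = H(X) + H(Y) - H(X,Y); symmetric and non-negative.
I = H(px) + H(py) - hxy
print(round(I, 4))  # ≈ 0.2781
```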
6.3 Lossless coding of information source

Symbol | Probability | Codeword
x1     | P(x1)       | 00
x2     | P(x2)       | 01
x3     | P(x3)       | 10
x4     | P(x4)       | 110
x5     | P(x5)       | 1110
x6     | P(x6)       | 11110
x7     | P(x7)       | 11111

H(X) = 2.11 bits per symbol
R = 2.21 bits per symbol
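Since the slide lists the probabilities P(x1)..P(x7) only symbolically, the sketch below uses hypothetical values to check two facts about this code table: Kraft's inequality holds for the prefix code, and the entropy never exceeds the average codeword length:

```python
import math

# Codewords from the table above.
code = {"x1": "00", "x2": "01", "x3": "10", "x4": "110",
        "x5": "1110", "x6": "11110", "x7": "11111"}

# Hypothetical probabilities (the slide gives P(x1)..P(x7) only symbolically).
p = {"x1": 0.3, "x2": 0.25, "x3": 0.2, "x4": 0.1,
     "x5": 0.08, "x6": 0.05, "x7": 0.02}

# Kraft inequality: sum of 2^-len(w) <= 1 for any prefix code.
kraft = sum(2 ** -len(w) for w in code.values())
print(kraft)  # 1.0 (this code is complete)

H = -sum(q * math.log2(q) for q in p.values())   # entropy, bits/symbol
R = sum(p[s] * len(code[s]) for s in code)       # average codeword length
print(round(H, 2), round(R, 2))
# Source coding theorem: H <= R for any uniquely decodable code.
```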
A channel is memoryless if each output symbol depends only on the current input symbol:
P(y1, ..., yn | x1, ..., xn) = P(y1|x1) P(y2|x2) ... P(yn|xn)
Source data: x0, x1, ..., xM-1; output data: y0, y1, ..., yQ-1.
The transition probabilities P(yj | xi) can be arranged in an M x Q channel matrix.
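A minimal sketch of such a channel matrix, using the binary symmetric channel (the crossover probability and input distribution are illustrative, not from the slides):

```python
# Binary symmetric channel as a 2x2 transition matrix P[i][j] = P(y_j | x_i).
p = 0.1                          # illustrative crossover (bit-error) probability
P = [[1 - p, p],
     [p, 1 - p]]

# Each row of a channel matrix must sum to 1.
assert all(abs(sum(row) - 1.0) < 1e-12 for row in P)

# Output distribution for an input distribution px: py_j = sum_i px_i * P[i][j].
px = [0.7, 0.3]
py = [sum(px[i] * P[i][j] for i in range(2)) for j in range(2)]
print([round(v, 2) for v in py])  # [0.66, 0.34]
```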
• Power constraint: an input sequence of length n must satisfy (1/n) * sum of xi^2 <= P.
• For input sequences with large n, the average power approaches P.
Input waveform x(t) -> channel -> output waveform y(t)
• With this power constraint, a waveform channel band-limited to W Hz is equivalent to 2W uses per second of a discrete-time channel (by the sampling theorem).
Channel capacity
• After source coding, we have a binary sequence of length n.
• The channel causes bit errors with probability p.
• As n -> infinity, the number of sequences that have np errors is approximately 2^(n Hb(p)), where Hb(p) = -p log2 p - (1 - p) log2 (1 - p) is the binary entropy function.
• To reduce errors, we use only a subset of all possible sequences, about 2^(n(1 - Hb(p))) of them, so that the likely error patterns of different codewords do not overlap.
• Hence, the capacity of the binary symmetric channel is C = 1 - Hb(p) bits per channel use: we cannot transmit more than 1 bit per channel use.
Channel capacity
• Capacity of an arbitrary discrete memoryless channel:
C = max over p(x) of I(X;Y)
• For the binary symmetric channel this maximum is attained with equiprobable inputs and gives C = 1 - Hb(p).
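The BSC capacity C = 1 - Hb(p) can be evaluated directly; a short sketch (the helper names are illustrative):

```python
import math

def Hb(p):
    """Binary entropy function, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """C = 1 - Hb(p) bits per channel use for a BSC with crossover probability p."""
    return 1 - Hb(p)

print(bsc_capacity(0.0))   # 1.0 (noiseless channel: the full 1 bit per use)
print(bsc_capacity(0.5))   # 0.0 (output independent of input: useless channel)
print(round(bsc_capacity(0.1), 3))
```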
Channel capacity
• Discrete-time AWGN channel with an input power constraint P and noise variance sigma^2:
C = (1/2) log2(1 + P/sigma^2) bits per channel use.
• For large n, any rate below C is achievable with vanishing error probability (sphere-packing argument).
• Transmission rate over a band-limited AWGN channel of bandwidth W (2W uses per second):
C = W log2(1 + P/(N0 W)) bits/s.
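A quick numerical check of the "2W uses per second" equivalence between the two capacity formulas; a sketch (the 3 kHz bandwidth and 20 dB SNR figures are illustrative):

```python
import math

# Discrete-time AWGN channel: C = 1/2 * log2(1 + SNR) bits per channel use.
def c_per_use(snr):
    return 0.5 * math.log2(1 + snr)

# Band-limited AWGN channel of bandwidth W, used 2W times per second:
# C = W * log2(1 + SNR) bits/s.
def c_bits_per_s(W, snr):
    return W * math.log2(1 + snr)

W, snr = 3000.0, 100.0   # illustrative: 3 kHz bandwidth, 20 dB SNR
# The bits/s capacity equals 2W channel uses times the per-use capacity.
assert abs(c_bits_per_s(W, snr) - 2 * W * c_per_use(snr)) < 1e-9
print(round(c_bits_per_s(W, snr)))  # ~19975 bits/s
```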
Channel capacity
(Figure adopted from Gatestudy.com)

GATE QUESTIONS
(Adopted from Gatestudy.com)
PREVIOUS YEAR AU QUESTIONS-PART A
1. What is entropy? Give its mathematical equation.
2. Define source coding. State the significance of source coding.
3. What is BSC?
4. Why is the Huffman code called a minimum redundancy code?
5. An event has six possible outcomes with probabilities {1/2, 1/4, 1/8, 1/16, 1/32, 1/32}. Solve for the entropy of the system.
6. Outline the concept of a discrete memoryless source.
7. Calculate the amount of information if pk = 1/4.
8. Identify the properties of entropy.
9. Describe information rate.
10. Interpret the theory of mutual information.
11. Describe the concept of a discrete memoryless channel.
12. List out the properties of Hamming distance.
13. Evaluate the Hamming distance between the code words C1={1,0,0,0,1,1,1} and C2={0,0,0,1,0,1,1}.
14. State the properties of mutual information.
15. Examine the types of discrete memoryless channels.
16. Give the main idea of channel capacity.
17. Summarize Shannon's law.
18. Formulate the steps involved in Shannon-Fano coding.
19. Distinguish the various source coding techniques.
20. Revise the steps involved in Huffman coding.
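Questions 5 and 13 above can be verified with a short sketch:

```python
import math

# Q5: entropy of outcomes with probabilities {1/2, 1/4, 1/8, 1/16, 1/32, 1/32}.
probs = [1/2, 1/4, 1/8, 1/16, 1/32, 1/32]
H = -sum(p * math.log2(p) for p in probs)
print(H)  # 1.9375 bits

# Q13: Hamming distance = number of positions where the codewords differ.
C1 = [1, 0, 0, 0, 1, 1, 1]
C2 = [0, 0, 0, 1, 0, 1, 1]
d = sum(a != b for a, b in zip(C1, C2))
print(d)  # 3
```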
PREVIOUS YEAR AU QUESTIONS-PART B
1. Enumerate Shannon-Fano algorithm and Huffman coding with a suitable example.
2. Five symbols of the alphabet of a discrete memoryless source and their probabilities are given below: S = {S0, S1, S2, S3, S4}, P(S) = {0.4, 0.19, 0.16, 0.15, 0.15}. Predict the symbols using Huffman coding and calculate the average codeword length and efficiency.
3. Illustrate the following with equations: (i) Uncertainty (ii) Information (iii) Entropy and its properties.
11. Five symbols of the alphabet of a discrete memoryless source and their probabilities are given below: S = {S0, S1, S2, S3, S4}, P(S) = {0.4, 0.2, 0.2, 0.1, 0.1}. Show the symbols using Shannon-Fano coding and calculate the average codeword length and efficiency.
12. A telephone channel has a bandwidth of 3 kHz. Predict the channel capacity of the telephone channel for an SNR of 20 dB. Estimate the minimum SNR required to support a rate of 5 kbps.
13. A telephone channel has a bandwidth of 3 kHz and an output SNR of 20 dB. The source has a total of 512 symbols, all equiprobable. Point out the following: (i) channel capacity, (ii) information content per symbol, (iii) maximum symbol rate for which error-free transmission is possible.
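Questions 12 and 13 above can be worked numerically; a sketch using the Shannon-Hartley formula:

```python
import math

W = 3000.0                    # bandwidth, Hz
snr_db = 20.0
snr = 10 ** (snr_db / 10)     # 20 dB -> SNR = 100

# Q12: Shannon capacity C = W * log2(1 + SNR).
C = W * math.log2(1 + snr)
print(round(C))               # ~19975 bits/s

# Q12: minimum SNR to support R = 5000 bits/s: SNR = 2^(R/W) - 1.
snr_min = 2 ** (5000 / W) - 1
print(round(snr_min, 3), "->", round(10 * math.log10(snr_min), 2), "dB")

# Q13: 512 equiprobable symbols carry log2(512) = 9 bits each,
# so the maximum error-free symbol rate is C / 9 symbols/s.
bits_per_symbol = math.log2(512)
print(bits_per_symbol, round(C / bits_per_symbol))
```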
PREVIOUS YEAR AU QUESTIONS-PART C
1. The source of information A generates the symbols {A0, A1, A2, A3, A4} with the corresponding probabilities {0.4, 0.3, 0.15, 0.1, 0.05}. Evaluate the code for the source symbols using Huffman and Shannon-Fano encoders and compare their efficiency.
2. The source of information A generates the symbols {A0, A1, A2, A3, A4, A5} with the corresponding probabilities {0.45, 0.41, 0.4, 0.3, 0.29, 0.05}. Evaluate the code for the source symbols using Huffman and Shannon-Fano encoders and compare their efficiency.
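Part C Question 1 can be checked with a small Huffman sketch (the implementation below is an illustration, not from the slides); it reports the codeword lengths, average length, and efficiency H/R:

```python
import heapq
import math

def huffman_lengths(probs):
    """Return the codeword length of each symbol under Huffman's algorithm."""
    # Heap items: (probability, unique tie-breaker, symbol indices in this subtree)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)   # two least probable subtrees
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:                 # each merge adds one bit to these symbols
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

# Part C, Question 1 probabilities.
p = [0.4, 0.3, 0.15, 0.1, 0.05]
L = huffman_lengths(p)
avg = sum(pi * li for pi, li in zip(p, L))            # average codeword length R
H = -sum(pi * math.log2(pi) for pi in p)              # source entropy
print(L, round(avg, 2), round(H / avg * 100, 1))      # lengths, R, efficiency %
```

With these probabilities the lengths come out {1, 2, 3, 4, 4}, giving R = 2.05 bits/symbol against an entropy of about 2.01 bits/symbol, i.e. roughly 98% efficiency.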