Unit I Digital Communication R2017 MCQ
4. The technique that may be used to increase average information per bit is
a. Shannon-Fano algorithm
b. ASK
c. FSK
d. Digital modulation techniques
ANSWER: a
6. The information rate R, for a given average information H = 2.0 bits/message, for an analog signal band-limited to B Hz is
a. 8 B bits/sec
b. 4 B bits/sec
c. 2 B bits/sec
d. 16 B bits/sec
ANSWER: (b)
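A quick numerical check of answer (b), assuming the message rate is the Nyquist rate r = 2B for a signal band-limited to B Hz (the bandwidth value below is a hypothetical example):

```python
# Information rate R = r * H, where r is the message rate in messages/sec.
# A signal band-limited to B Hz must be sampled at the Nyquist rate r = 2B.
B = 1000.0   # hypothetical bandwidth in Hz
H = 2.0      # average information per message, in bits
r = 2 * B    # Nyquist rate, messages/sec
R = r * H    # information rate in bits/sec
print(R)     # 4000.0, i.e. 4B bits/sec
```

With H = 2.0, R = 2B x 2.0 = 4B bits/sec, matching option (b).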
7. Information rate is defined as
a. Information per unit time
b. Average number of bits of information per second
c. rH
d. All of the above
ANSWER: (d)
10. Entropy is
a. Average information per message
b. Information in a signal
c. Amplitude of signal
d. All of the above
ANSWER: (a)
12. The information I contained in a message with probability of occurrence P is given by (k is a constant)
a. I = k log2(1/P)
b. I = k log2(2P)
c. I = k log2(1/2P)
d. I = k log2(1/P2)
ANSWER: (a)
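Formula (a) can be sketched directly, taking k = 1 so that information is measured in bits; the probabilities used below are illustrative examples:

```python
import math

def information(p, k=1.0):
    """Information I = k * log2(1/P) carried by a message of probability P."""
    return k * math.log2(1.0 / p)

print(information(0.5))    # 1.0 bit: a fair coin flip carries one bit
print(information(0.125))  # 3.0 bits: rarer messages carry more information
```

Note that information grows as P shrinks, which is why the less probable message carries more bits.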
13. The expected information contained in a message is called
a. Entropy
b. Efficiency
c. Coded signal
d. None of the above
ANSWER: (a)
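The definition in answer (a), entropy as the expected information per message, can be sketched numerically; the four-message distribution below is a hypothetical example:

```python
import math

def entropy(probs):
    """Entropy H = sum(p * log2(1/p)): expected information per message, in bits."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# Four equally likely messages carry log2(4) = 2 bits each on average.
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
```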
18. The capacity of a binary symmetric channel, where H(P) is the binary entropy function, is
a. 1 – H(P)
b. H(P) – 1
c. 1 – H(P)2
d. H(P)2 – 1
ANSWER: (a)
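A minimal sketch of answer (a), C = 1 - H(P), for a binary symmetric channel with crossover probability P:

```python
import math

def binary_entropy(p):
    """Binary entropy function H(p), in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity C = 1 - H(p) of a binary symmetric channel, in bits per use."""
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.0))  # 1.0: a noiseless channel carries one full bit per use
print(bsc_capacity(0.5))  # 0.0: the output is independent of the input
```

Capacity falls from 1 to 0 as the crossover probability rises from 0 to 0.5.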
19. Entropy of a random variable is
a) 0
b) 1
c) Infinite
d) Cannot be determined
Answer: c
23. For M equally likely messages with M >> 1, if the rate of information R ≤ C, the probability of error is
a. Arbitrarily small
b. Close to unity
c. Not predictable
d. Unknown
ANSWER: (a)
24. For M equally likely messages with M >> 1, if the rate of information R > C, the probability of error is
a. Arbitrarily small
b. Close to unity
c. Not predictable
d. Unknown
ANSWER: (b)
25. The value of the probability density function of a random variable is
a) Positive function
b) Negative function
c) Zero
d) One
Answer: a
27. The process of data conversion along with formatting the data is called ______
a) Formatting
b) Modulation
c) Source coding
d) Amplifying
Answer: c
28. The process that transforms text into binary digits is called _______
a) Binary coding
b) Data coding
c) Character coding
d) Sampling
Answer: c
29. For number of bits k = 1 and number of symbols M = 2, the system is called
a) Unary
b) Binary
c) Quaternary
d) None of the mentioned
Answer: b
31. Entropy calculation returns
a) Random variable
b) Deterministic number
c) Random number
d) None of the mentioned
Answer: b
32. The entropy of N independent random variables is the _____ of the entropies of the individual random variables.
a) Sum
b) Product
c) Sum of squares
d) Average
Answer: a
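The additivity in question 32 can be verified numerically for independent variables, where the joint probability factors as p(x, y) = p(x)p(y); the two distributions below are hypothetical examples:

```python
import math

def entropy(probs):
    """Entropy H = sum(p * log2(1/p)), in bits."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

px = [0.5, 0.5]          # H(X) = 1.0 bit
py = [0.25, 0.25, 0.5]   # H(Y) = 1.5 bits

# Joint distribution of independent X and Y: p(x, y) = p(x) * p(y)
pxy = [a * b for a in px for b in py]

print(entropy(px) + entropy(py))  # 2.5
print(entropy(pxy))               # 2.5 as well: joint entropy equals the sum
```

For dependent variables the joint entropy is strictly less than the sum, which is why independence matters in the statement.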
37. The event with minimum probability has the least number of bits.
a) True
b) False
Answer: b
38. When the base of the logarithm is 2, then the unit of measure of information is
a) Bits
b) Bytes
c) Nats
d) None of the mentioned
Answer: a
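A short sketch of how the base of the logarithm sets the unit, using an example probability of 0.25; base 2 gives bits, and the natural logarithm gives nats, with ln 2 as the conversion factor:

```python
import math

p = 0.25
bits = math.log2(1.0 / p)  # base-2 logarithm -> information in bits
nats = math.log(1.0 / p)   # natural logarithm -> information in nats

print(bits)                # 2.0
print(nats / math.log(2))  # dividing nats by ln(2) recovers the value in bits
```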
41. Coded systems are inherently capable of better transmission efficiency than uncoded systems.
a) True
b) False
Answer: a