Suez Canal University
Faculty of Computers & Informatics
Department of Information Systems

Final Exam (Form), Second Term 2020/2021
Program: Information Systems | Course: Information Theory | Course Code: 18207 | Level: 2
Lecturer: Dr. Osama Farouk | Date: 17/6/2021 | Total pages: 6 | Total marks: 80 | Time allowed: 3 h.

Answer the following questions:

Question (Select the correct answer) (80 marks)

• Joint entropy. Let p(x, y) be given by:

            Y1        Y2
   X1      0.5       0.25
   X2      0         0.125
   X3      0.0625    0.0625

1. The entropy H(X) =
A. 1.06127 bits/symbol   B. 1.5687 bits/symbol   C. 2 bits/symbol   D. 1.024 bits/symbol

2. The system entropy H(X, Y) =
A. 0.589 bits/symbol   B. 1.875 bits/symbol   C. 0.5 bits/symbol   D. 1.35 bits/symbol

3. The noise entropy H(Y|X) =
A. 0.87 bits/symbol   B. 1.53 bits/symbol   C. 0.81373 bits/symbol   D. 0.35 bits/symbol

4. The loss entropy H(X|Y) =
A. 0.8863 bits/symbol   B. 0.8521 bits/symbol   C. 0 bits/symbol   D. 2 bits/symbol

5. The mutual information between x1 and y2 =
A. -0.4213 bits/symbol   B. 0.3923 bits   C. 0 bits   D. -0.3923 bits

6. The transinformation I(X;Y) =
A. 0.1749   B. 0.217   C. 0.1978   D. -0.235

• For X data transmitted and received as Y after passing through the channel:

7. ______ is the noise added by the channel.
A. H(X,Y)   B. I(X;Y)   C. H(X|Y)   D. H(Y|X)

8. ______ is the intersection between the transmitted and received data, which represents the data that survived the channel.
A. H(X,Y)   B. I(X;Y)   C. H(X|Y)   D. H(Y|X)

9. ______ is the data lost through the channel that did not arrive.
A. H(X,Y)   B. I(X;Y)   C. H(X|Y)   D. H(Y|X)

• Consider a binary symmetric communication channel whose input source is the alphabet X = {0, 1} with probabilities {0.5, 0.5}, whose output alphabet is Y = {0, 1}, and whose channel matrix is

   [ 1-ε    ε  ]
   [  ε    1-ε ]

where ε is the probability of transmission error.

10. What is the entropy of the input, H(X)?
A. 1/2   B. 0   C. 1   D. 3

11. What is the entropy of the output distribution, H(Y)?
A. 1   B. 0   C. 2   D. 3

12. What is the joint entropy H(X,Y)?
A. 1 - ε log(ε) - (1-ε) log(1-ε)   B. 1 - (1-ε) log(1-ε)   C. 1 + ε log(ε) - (1-ε) log(1-ε)   D. 1

13. What is the mutual information of this channel, I(X;Y)?
A. 1 + ε log(ε) + (1-ε) log(1-ε)   B. 1 + (1-ε) log(1-ε)   C. 1 - ε log(ε) + (1-ε) log(1-ε)   D. 1

• The input source to a noisy communication channel is a random variable X over the four symbols a, b, c, d. The output from this channel is a random variable Y over these same four symbols. The joint distribution of these two random variables is as follows:

   Y \ X      a       b       c       d
     a       1/8     1/16    1/32    1/32
     b       1/16    1/8     1/32    1/32
     c       1/16    1/16    1/16    1/16
     d       1/4      0       0       0

14. Calculate the marginal entropy H(X) in bits.
A. …   B. …   C. 1.75   D. 4

15. Calculate the marginal entropy H(Y) in bits.
A. 2   B. 1.25   C. 0.5   D. 2.25

16. What is the joint entropy H(X,Y) of the two random variables in bits?
A. 3.224   B. 2.74   C. 3.375   D. 2.45

17. What is the conditional entropy H(Y|X) in bits?
A. 1.41   B. 1.25   C. 0.5   D. 1.625

18. What is the mutual information I(X;Y) between the two random variables in bits?
A. 0.44   B. 3.75   C. 0.375   D. 0.54
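The quantities asked for in questions 1-6 and 14-18 all follow from the chain-rule identities H(Y|X) = H(X,Y) - H(X), H(X|Y) = H(X,Y) - H(Y) and I(X;Y) = H(X) + H(Y) - H(X,Y). The sketch below is a minimal Python/NumPy illustration, assuming the 3x2 joint table of questions 1-6 was read correctly from the scan; the same routine applies unchanged to the four-symbol table of questions 14-18.

```python
import numpy as np

# Joint distribution p(x, y) for questions 1-6 as read from the scan:
# rows are X1..X3, columns are Y1, Y2.
p_xy = np.array([[0.5,    0.25  ],
                 [0.0,    0.125 ],
                 [0.0625, 0.0625]])

def entropy(p):
    """Shannon entropy in bits, ignoring zero-probability entries."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

H_xy = entropy(p_xy)               # system (joint) entropy H(X,Y)  ~ 1.875
H_x = entropy(p_xy.sum(axis=1))    # marginal entropy H(X)          ~ 1.0613
H_y = entropy(p_xy.sum(axis=0))    # marginal entropy H(Y)          ~ 0.9887
H_y_given_x = H_xy - H_x           # noise entropy H(Y|X)           ~ 0.8137
H_x_given_y = H_xy - H_y           # loss entropy H(X|Y)            ~ 0.8863
I_xy = H_x + H_y - H_xy            # transinformation I(X;Y)        ~ 0.1750

print(H_x, H_xy, H_y_given_x, H_x_given_y, I_xy)
```

The printed values line up with the option values that appear in questions 1-6 (1.06127, 1.875, 0.81373, 0.8863 and 0.1749), which is also how the 3x2 table above was cross-checked against the scan.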
• Consider a random variable X which takes on the values -1, 0 and 1 with probabilities P(-1) = 1/4, P(0) = 1/2, P(1) = 1/4. For parts (c) and (d) we consider a sequence of instances of this random variable.

19. Find the entropy in base 4 of this random variable.
A. 1.02   B. 5.21   C. …   D. …

20. Find the entropy in base 2 of this random variable.
A. 5   B. 6.26   C. 1.5   D. 4.34

• We flip a fair coin twice. Let X denote the outcome of the first flip and Y denote the outcome of the second flip.

21. Find H(X,Y).
A. 1   B. 6   C. …   D. …

22. Find H(X).
A. 4   B. 6   C. 7   D. …

23. Find H(X|Y).
A. 6   B. 5   C. …   D. …

24. Find H(Y).
A. …   B. …   C. …   D. …

25. …

26. …

• Let X be a random variable whose entropy H(X) is 8 bits. Suppose that Y(X) is a deterministic function that takes on a different value for each value of X.

27. What then is H(Y), the entropy of Y?
A. 16   B. 5   C. 8   D. 32

28. What is H(Y|X), the conditional entropy of Y given X?
A. 0   B. 1   C. 5   D. 10

29. What is H(X|Y), the conditional entropy of X given Y?
A. 0   B. 1   C. 8   D. 10

30. What is H(X,Y), the joint entropy of X and Y?
A. 16   B. 5   C. 8   D. 32

31. In an unsymmetric communication system, a zero or a one can be transmitted with P(X = 0) = … and P(X = 1) = …, and a one can be received as a zero with probability ….
A. 0.44   B. 1   C. 0.5625   D. …

32. Consider a digital signal transmitted through two stages in sequence. Each stage performs independently. Suppose the digit zero is transmitted; there is a finite probability that the input digits may be inverted by each stage because of the presence of noise in circuits. The probabilities of inversion for the two stages respectively are:
1st stage: P(1|0) = 1/4, P(0|1) = 1/2
2nd stage: P(1|0) = 1/2, P(0|1) = 1/4
Calculate the probability that a zero is received at the end of the two stages, if the input digit to the first stage is a zero.
A. …   B. …   C. …   D. …

33. The chance that a bit transmitted through a digital transmission channel is received in error is …. The transmission trials are independent. Let X be the number of bits in error in the next four bits transmitted. Determine the standard deviation of this experiment.
A. 0.36   B. 0.4   C. …   D. 0.6

34. The destination of the information transmitted (person, computer, disk, analog radio or TV, internet) is the:
A. Source   B. Receiver   C. Noise   D. Channel

35. Processing done before placing information into the channel:
A. Code   B. Encoder   C. Decoder   D. Source

36. For 64 equally likely and independent messages, find the information content (in bits) of each message.
A. 6   B. 7   C. 8   D. 5

37. Calculate the information capacity for 100 words, each containing 6 symbols chosen at random from a 33-character alphabet (23 letters, 10 numerical digits and one blank space).
A. 2068   B. 4652   C. …   D. 3000

38. For events X, Y, Z, if the entropy H = 0.48 and the self-information is 0.6, find the probability.
A. …   B. 0.288   C. …   D. 0.2

39. When the probability of error during transmission is 0.5, it indicates that:
A. The channel is very noisy   B. No information is received   C. None of the mentioned   D. The channel is very noisy & no information is received

40. The method of converting a word to a stream of bits is called:
A. Source coding   B. Binary coding   C. Bit coding   D. Cipher coding

41. For M equally likely messages, the average amount of information H is:
A. H = 2 log10 M   B. H = log2 M   C. H = log10 M   D. H = log10 M²

42. Entropy is:
A. Amplitude of a signal   B. Average information per message   C. Information in a signal   D. All of the above

43. How many bits per symbol are needed to encode 32 different symbols?
A. 2   B. 5   C. 16   D. 32

44. The symbols A, B, C, D occur with probabilities 1/2, 1/4, 1/8 and 1/8 respectively. Find the self-information in the message 'BDA', where symbols are independent.
A. 10   B. 12   C. 6   D. 8

45. The entropy for a fair coin toss is exactly:
A. 2 bits   B. 3 bits   C. 1 bit   D. 5 bits

46. For which value(s) of p is the binary entropy function H(p) maximized?
A. 1   B. …   C. 0.5   D. 0
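Questions 45 and 46 both turn on the binary entropy function H(p) = -p log2(p) - (1-p) log2(1-p). A short sketch, assuming NumPy, confirms that a fair coin toss carries exactly 1 bit and that H(p) is maximized at p = 1/2:

```python
import numpy as np

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), with the convention H(0) = H(1) = 0."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return float(-p * np.log2(p) - (1 - p) * np.log2(1 - p))

# Question 45: a fair coin toss (p = 0.5) carries exactly 1 bit.
print(binary_entropy(0.5))        # 1.0

# Question 46: scanning a grid of p values puts the maximum at p = 1/2.
grid = np.linspace(0.0, 1.0, 1001)
values = np.array([binary_entropy(p) for p in grid])
print(grid[values.argmax()])      # 0.5
```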
47. A random variable X takes the values 0 and 1 with equal probabilities. The variance is:
A. 0.25   B. 0.5   C. 0.75   D. 1

48. Self-information should be:
A. Negative   B. Positive & negative   C. None of the mentioned   D. Positive

49. When the base of the logarithm is 2, the unit of measure of information is:
A. Bits   B. Byte   C. Binary digits   D. None of the mentioned

50. An analysis of a long message shows 1000 zeros and 3000 ones. What is the probability that any one digit in the message is a zero?
A. 0.25   B. 0.5   C. …   D. …

51. Consider a source transmitting six symbols with probabilities as given: …. Find the average information content (entropy) of the source.
A. 1.756   B. 1.7   C. 1.937   D. 2.254

52. ______ is concerned with the theoretical limitations and potentials of systems that communicate information.
A. Probability theory   B. Information theory   C. Quantum theory   D. None of these

53. ______ is a function which measures the amount of information.
A. Mutual entropy   B. Entropy   C. Self-information   D. None of these

54. ______ is the average number of bits of information per ….
A. …   B. Joint entropy   C. …   D. None of these

55. A PCM source transmits four samples (messages) at a rate of 2 samples/second. The probabilities of occurrence are p1 = p2 = p3 = p4 = 1/4. Find the information rate of the source.
A. 4 bits/second   B. 2 bits/second   C. 8 bits/second   D. …

56. ______ measures the amount of information obtainable about one random variable from the other.
A. Conditional entropy   B. Entropy   C. Self-information   D. None of these

57. The entropy of a discrete random variable X is defined by:
A. H(X) = -Σx p(x) log p(x)   B. …   C. H(X) = Σx p(x) log p(x)   D. …

58. For a given event with entropy H(X) = 1.56, the units are:
A. bits   B. 4-its   C. e-its   D. …

59. The following statements describe which quantity: …?
A. H(X,Y)   B. …   C. …   D. H(X|Y)

60. Which of the following is NOT representative of the mutual information?
A. I(X;Y) = H(X) - H(X|Y)   B. I(X;Y) = I(Y;X)   C. …   D. I(X;Y) = H(X) + H(Y) - H(X,Y)

61. The input source to a noisy communication channel is a random variable X over the three symbols a, b and c. The output from this channel is a random variable Y over these same three symbols. The joint distribution of these two variables is as given. What is the value of H(Y|X)?
A. H(Y|X) = 0.5664   B. H(Y|X) = 1.52832   C. H(Y|X) = 2.58496   D. H(Y|X) = 1.75821

62. Suppose that the following sequence of Yes/No questions was an optimal strategy for playing the "Game of 7 questions" to learn which of the letters {A, B, C, D, E, F, G} someone had chosen, given that their a priori probabilities were known: "Is it A?" No. "Is it a member of the set {B, C}?" No. "Is it a member of the set {D, E}?" No. "Is it F?" No. What is the entropy of this alphabet?
A. 2.75   B. 2.55   C. 2.25   D. 2.52

63. The channel capacity is:
A. The maximum information transmitted by one symbol over the channel   B. Information contained in a signal   C. The amplitude of the modulated signal   D. All of the above

64. The capacity of a binary symmetric channel, given H(P) is the binary entropy function, is:
A. …   B. H(P) - 1   C. 1 - H(P)   D. H(P)² - 1

65. The information rate R for given average information H = … for an analog signal band-limited to B Hz is:
A. 8B bits/sec   B. 4B bits/sec   C. 2B bits/sec   D. 16B bits/sec

66. Information rate is defined as:
A. Information per unit time   B. Average number of bits of information per second   C. r·H   D. All of the above
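Questions 55, 65 and 66 rely on the information-rate relation R = r·H (symbol rate times entropy per symbol), and question 64 on the binary symmetric channel capacity C = 1 - H(ε). The sketch below, assuming NumPy, works both out; the error probability ε = 0.1 is only an illustrative value, since the exam leaves ε symbolic:

```python
import numpy as np

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    p = np.asarray(probs, dtype=float)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Question 55: a PCM source emits four equiprobable messages at r = 2 messages/second,
# so R = r * H = 2 * 2 = 4 bits/second.
H_msg = entropy([0.25, 0.25, 0.25, 0.25])
r = 2.0
print(r * H_msg)                        # 4.0

# Question 64: the capacity of a binary symmetric channel is C = 1 - H(eps).
eps = 0.1                               # illustrative value only
print(1.0 - entropy([eps, 1.0 - eps]))  # ~0.531 bits per channel use
```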
67. The mutual information:
A. Is symmetric   B. Is always non-negative   C. Both A and B   D. None of the above

68. The relation between entropy and mutual information is:
A. I(X;Y) = H(X) - H(X|Y)   B. I(X;Y) = H(X|Y) - H(Y|X)   C. I(X;Y) = H(X) - H(Y)   D. I(X;Y) = H(Y) - H(X)

69. Entropy is:
A. Average information per message   B. Information in a signal   C. Amplitude of a signal   D. All of the above

70. The information I contained in a message with probability of occurrence P is given by (k is a constant):
A. I = k log2(1/P)   B. …   C. I = k log2(1/2P)   D. I = k log2(1/P²)

71. The expected information contained in a message is called:
A. Entropy   B. Efficiency   C. Coded signal   D. None of the above

72. The information rate basically gives an idea about the generated information per ______ by the source.
A. Second   B. Minute   C. Hour   D. None of these

73. The unit of average mutual information is:
A. Bits   B. Bytes   C. Bits per symbol   D. Bytes per symbol

74. When X and Y are statistically independent, then I(X;Y) is:
A. 1   B. 0   C. ln 2   D. Cannot be determined

75. The self-information of a random variable is:
A. 1   B. 0   C. Infinite   D. Cannot be determined

76. The entropy of a random variable is:
A. 1   B. 0   C. Infinite   D. Cannot be determined

77. The unit of joint entropy H(X,Y) is:
A. Bits   B. Bytes   C. Bits per symbol   D. Bytes per symbol

78. The ______ of a random variable is a measure of central location.
A. Mean   B. Variance   C. Standard deviation   D. None of these

79. The memoryless source refers to:
A. No previous information   B. No message storage   C. Emitted message is independent of previous message   D. None of the above

80. ______ is the representation of information in another form.
A. Encoding   B. …   C. Source   D. Receiver

Best wishes,
Dr. Osama Farouk

Examination committee:   Prof. ………   Prof. ………   Prof. ………   Prof. ………
