
Digital architecture and design questions

1.
1) Talk about the incomplete if statement.
2) Find two errors in the following code:
1st error: the port map was defined positionally, but the output and input placements were inverted.
2nd error: the clock and reset process was written so that if you drive the reset and the clock signal at the same time, the output has two different statements fed to it at the same time.
3) How would you create a pipelined version of the project?
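An incomplete if is the classic way to infer an unintended latch. A minimal sketch, assuming hypothetical `std_logic` signals `en`, `d` and `q`:

```vhdl
-- Incomplete if in a combinational process: when en = '0', q is not
-- assigned, so it must keep its old value -> synthesis infers a latch.
process(en, d)
begin
  if en = '1' then
    q <= d;
  end if;   -- no else branch: latch inferred
end process;

-- Complete version: every path assigns q, so a plain multiplexer
-- is synthesized instead of a latch.
process(en, d)
begin
  if en = '1' then
    q <= d;
  else
    q <= '0';
  end if;
end process;
```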

2.
1) What is the difference in logic synthesis between an unmapped and a mapped logic circuit?
2) I had a test bench of a D flip-flop.
1st error: the port map was defined positionally, but the output and input placements were inverted.
2nd error: the signals were declared in the entity of the test bench, which is not allowed.
3) In which case is it good to put all the components in series? (For pipelining and running multiple decryptions at the same time.)
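A sketch of how those two errors are avoided, assuming a hypothetical `dff` entity with ports `clk`, `rst`, `d`, `q`:

```vhdl
library ieee;
use ieee.std_logic_1164.all;

entity tb_dff is
end entity;   -- a test-bench entity has no ports and no signal declarations

architecture sim of tb_dff is
  -- signals live in the architecture declarative part, not in the entity
  signal clk, rst, d, q : std_logic := '0';
begin
  -- named association makes the input/output order mix-up impossible
  dut : entity work.dff
    port map (clk => clk, rst => rst, d => d, q => q);

  clk <= not clk after 5 ns;   -- simple clock generator
end architecture;
```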

3.
My questions:
1) Worst negative slack and total negative slack -> consequence on the implementation.
2) I had a "when others" missing, and when the second process (a simple reset) changed at the same time as the first process there was a problem. Solution: putting the processes together.
3) In my case, the first 4 blocks (addkey, ...) are in series. How do I put them "together"? -> state machine.
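"Putting the processes together" usually means giving the register a single driving process, with the reset folded into it. A hypothetical sketch:

```vhdl
-- One process drives q: the asynchronous reset has priority, and the
-- synchronous update happens on the clock edge. No second process can
-- fight over the same signal.
process(clk, rst)
begin
  if rst = '1' then
    q <= '0';
  elsif rising_edge(clk) then
    q <= d;
  end if;
end process;
```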

4.
My questions:
1- Explain the two VHDL processes for implementing the state machine.
2- Code mistakes: "when others" was missing in a case + an incomplete if.
3- How to reduce the number of components instead of repeating them in series? (four components for the four modules) => state machine.
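The two-process style from 1- splits the state register from the next-state logic, and the "when others" from 2- completes the case. A hypothetical sketch, with assumed signals `clk`, `rst` and `start`:

```vhdl
-- In the architecture declarative part:
type state_t is (IDLE, RUN, DONE);
signal state, next_state : state_t;

-- Process 1: the state register (sequential)
process(clk, rst)
begin
  if rst = '1' then
    state <= IDLE;
  elsif rising_edge(clk) then
    state <= next_state;
  end if;
end process;

-- Process 2: the next-state logic (combinational)
process(state, start)
begin
  case state is
    when IDLE =>
      if start = '1' then next_state <= RUN;
      else next_state <= IDLE; end if;
    when RUN =>
      next_state <= DONE;
    when others =>           -- completes the case (covers DONE here)
      next_state <= IDLE;
  end case;
end process;
```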
Signal theory questions

1.
For signal theory: you start with off-the-top-of-your-head questions, 2 min, closed book.
Then you have 40+ min to prepare two questions, fully open book. The questions are slightly harder variants of the questions in the list.
During the oral, you go over your notes and then he asks a bonus question at the end. I got prefix codes + Kraft's inequality.
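For reference, the Kraft inequality from that bonus question: a binary prefix code with codeword lengths $l_1, \dots, l_n$ exists if and only if

```latex
\sum_{i=1}^{n} 2^{-l_i} \le 1
```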

2.
Additive white Gaussian noise channel capacity.
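This is the Shannon-Hartley expression for a bandlimited AWGN channel: with bandwidth $B$, signal power $S$ and noise power $N$,

```latex
C = B \log_2\left(1 + \frac{S}{N}\right) \quad \text{bits/s}
```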

3.
For the signal theory exam, I had questions 6 and 68. For question 6, he went further by asking me questions about the entropy of a Markov process. For question 68, he went further by asking me questions about the properties of the autocorrelation for WSS processes. The bonus question was to explain prefix codes, the Kraft inequality, and Huffman coding. The teacher also gave me my grade at the end of the exam.
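The standard autocorrelation properties for a real WSS process $x(t)$, in case they come up:

```latex
R_x(\tau) = R_x(-\tau), \qquad |R_x(\tau)| \le R_x(0), \qquad R_x(0) = \mathbb{E}[x^2(t)]
```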

4.
The OTTOYH questions are actually simpler to remember than I thought, as he gives you 50% of the answer for each question. I got questions 19 and 82, nothing special. He saw that I drew a Venn diagram, so he asked me some questions about it (like deriving the mutual information using the joint entropy). I had to discuss the three coding methods in detail; he asked me more about prefix codes (what they are, then he showed me two codes and I had to tell which one was a prefix code). He asked me the difference between a matched and a Wiener filter, and the expression of the Wiener filter if the noise and signal are uncorrelated (+ how to get the impulse response of the filter). He asked me what the optimal distributions are in continuous time. I had like 6 bonus questions in total lmao.
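The Venn-diagram identity in question, relating mutual information to the entropies:

```latex
I(X;Y) = H(X) + H(Y) - H(X,Y) = H(X) - H(X \mid Y) = H(Y) - H(Y \mid X)
```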

5. Questions 73 and 18

For question 73, he asked me about the Wiener-Khinchin theorem, drawing the autocorrelation in the time domain, and what an RST signal is.
For question 18, he asked me about the channel capacity (the definition with mutual information, and for AWGN with the SNR) and the Shannon coding theorem.
The non-prepared question was about the 3 different encoding processes: describe how each of them works.
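The Wiener-Khinchin theorem that keeps coming up: for a WSS process, the power spectral density is the Fourier transform of the autocorrelation,

```latex
S_x(f) = \int_{-\infty}^{\infty} R_x(\tau)\, e^{-j 2\pi f \tau}\, d\tau
```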
6. 22, 73
The prof wants you to explain the formula and where it comes from, and to draw the autocorrelation in the time domain. Also, what is the difference between the Wiener filter and the matched filter; plus the Shannon code algorithm and question 21.

7. Questions 2 and 72

He asked me about the link between discrete time and continuous time, and the assumptions that we have to make.
As the next question after the demonstration, he asked me to describe the different binary encodings.
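Assuming the discrete/continuous link refers to sampling: for a signal bandlimited to $B$, sampling at $f_s = 1/T \ge 2B$ allows exact reconstruction by sinc interpolation,

```latex
x(t) = \sum_{n=-\infty}^{\infty} x[nT]\, \mathrm{sinc}\!\left(\frac{t - nT}{T}\right)
```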

8.
Questions 12 and 70
Starting from question 12 he asked: mutual information (what it is, write one of the possible expressions starting from the entropies and highlight it using Venn diagrams) + the definition of channel capacity in the general case and in the case of additive noise + the channel capacity in the case of a bandlimited signal and noise (the one with the SNR) + Shannon's theorem (R < C) + how to maximise the entropy (both cases: bounded amplitude and limited power).
From question 70 he asked the Wiener-Khinchin theorem + the conditions under which it can be applied (WSS) + the difference between the matched filter and the Wiener filter, both regarding structure and applications (focus on the fact that you need to know the input signal for the matched filter, while for the Wiener filter you don't know the input signal and it is stochastic).
3rd question -> the definition of a prefix code + the 3 coding techniques + he wrote the probabilities on paper and asked me to apply the Huffman encoding algorithm.
Lots of questions, but he makes you feel comfortable, and if you can answer he is ready to give you high scores.
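A worked example of that final Huffman step, with assumed probabilities $\{0.4, 0.3, 0.2, 0.1\}$: repeatedly merge the two smallest,

```latex
0.1 + 0.2 \to 0.3, \qquad 0.3 + 0.3 \to 0.6, \qquad 0.6 + 0.4 \to 1.0
```

which gives codeword lengths $\{1, 2, 3, 3\}$ and average length $\bar{L} = 0.4(1) + 0.3(2) + 0.2(3) + 0.1(3) = 1.9$ bits/symbol, just above the source entropy $H \approx 1.85$ bits.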

9.
3, 68
Questions on entropy for Bernoulli processes, and saying that white Gaussian noise is strongly stationary and ergodic in…
A lot of OTTOYH questions, so if you know them all you're good.
Then he asked me about the 3 binary encodings we saw: Shannon, Huffman, and I don't remember the last one. So you just had to explain the 3 encoding systems and tell him how many symbols we would encode (there's a formula). We just have to explain the principle, draw how the thing works, and that's done.
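For the Bernoulli entropy question: a Bernoulli($p$) source has the binary entropy

```latex
H(p) = -p \log_2 p - (1-p) \log_2 (1-p)
```

maximised at $p = 1/2$, where $H = 1$ bit.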
