Task Performance (Principles)


Armin D. Mendonez

BT303

Task Performance
Answer:

a. Input Entropy: The input entropy is calculated using the formula H(X) = -Σ p(x) * log2(p(x)), where X is the random variable representing the input states and p(x) is the probability of each input state. In this case, all four input states are equiprobable, so each p(x) = 1/4. Therefore, the input entropy is: H(X) = -4 * (1/4) * log2(1/4) = log2(4) = 2 bits
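For illustration, here is a minimal Python sketch of this calculation (the entropy helper name is ours, not part of the task):

    import math

    def entropy(probs):
        # Shannon entropy: H = -sum of p * log2(p) over all outcomes
        return -sum(p * math.log2(p) for p in probs)

    print(entropy([1/4] * 4))   # -> 2.0 bits for four equiprobable input states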

b. Noise Entropy: The noise entropy is calculated using the same formula, but for the random variable N representing the noise values: H(N) = -Σ p(n) * log2(p(n)), where p(n) is the probability of each noise value. Again, all three noise values are equiprobable, so each p(n) = 1/3. Therefore, the noise entropy is: H(N) = -3 * (1/3) * log2(1/3) = log2(3) ≈ 1.585 bits
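Reusing the entropy helper from the sketch above:

    print(entropy([1/3] * 3))   # -> ~1.585 bits for three equiprobable noise values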

c. Outputs for each equiprobable input state: Each output is simply the sum of the corresponding input state and noise value. For example, the output of the first input state (35) with the first noise value (5) is 35 + 5 = 40. All twelve combinations are tabulated below (see the sketch after the table for a way to generate them).

Input State   Noise Value   Output

35            5             40
35            10            45
35            15            50
65            5             70
65            10            75
65            15            80
95            5             100
95            10            105
95            15            110
125           5             130
125           10            135
125           15            140
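The table can be reproduced with a short Python sketch (the input and noise values are taken directly from the table above):

    inputs = [35, 65, 95, 125]
    noise = [5, 10, 15]
    for x in inputs:
        for n in noise:
            print(x, n, x + n)   # input state, noise value, output = input + noise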

d. Output Entropy: The output entropy is calculated from the probability distribution of the output values. Each of the twelve (input, noise) combinations occurs with probability 1/12, and the table in part (c) shows that all twelve sums are distinct, so the output is uniformly distributed over twelve equiprobable values. In that case the output entropy equals the joint entropy of the independent input and noise, which is the sum of their individual entropies.

Therefore, the output entropy is:

H(X + N) = H(X) + H(N) = 2 bits + 1.585 bits ≈ 3.585 bits

(Note that this shortcut holds only because no two input-noise pairs produce the same output; if any sums coincided, the output entropy would be strictly less than H(X) + H(N).)
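As a quick numerical check (a Python sketch reusing the entropy helper and the inputs and noise lists from the sketches above), we can build the distribution of sums directly and confirm that it is uniform over twelve values:

    from collections import Counter

    # Every (input, noise) pair has probability 1/12 (4 inputs x 3 noise values).
    counts = Counter(x + n for x in inputs for n in noise)
    probs = [c / 12 for c in counts.values()]
    print(len(counts))      # -> 12 distinct outputs, so no sums collide
    print(entropy(probs))   # -> ~3.585 bits = log2(12)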
