Reasoning About Uncertainty: Entropy
Conditional entropy

Entropy of X given Y = y:

H(X|y) = -\sum_{x \in X} \Pr[x|y] \log_2 \Pr[x|y]

Conditional entropy (or equivocation): the weighted average of the
above over all y ∈ Y:

H(X|Y) = \sum_{y \in Y} \Pr[y] H(X|y)
       = \sum_{y \in Y} \Pr[y] \left( -\sum_{x \in X} \Pr[x|y] \log_2 \Pr[x|y] \right)
       = -\sum_{y \in Y} \sum_{x \in X} \Pr[y] \Pr[x|y] \log_2 \Pr[x|y]

Conditional entropy: fun facts

• H(X, Y) = H(Y) + H(X|Y).
• H(X|Y) ≤ H(X), with equality iff X and Y are independent:
  extra knowledge of Y cannot increase our uncertainty about X.
• Example: a random 32-bit integer has H(X) = 32 bits, but if we
  know that it is odd, the uncertainty is reduced by 1 bit.
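The definitions and fun facts above can be checked numerically. A minimal Python sketch; the joint distribution below is an arbitrary toy example, not one from the lecture:

```python
import math

def H(probs):
    """Shannon entropy in bits of a probability vector (zero entries skipped)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Toy joint distribution Pr[x, y] over X = {0, 1}, Y = {0, 1} (made up).
joint = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.40, (1, 1): 0.10}

# Marginals Pr[y] and Pr[x].
p_y = {y: sum(p for (_, y2), p in joint.items() if y2 == y) for y in (0, 1)}
p_x = {x: sum(p for (x2, _), p in joint.items() if x2 == x) for x in (0, 1)}

# H(X|Y) = sum over y of Pr[y] * H(X | Y = y), with Pr[x|y] = Pr[x, y] / Pr[y].
h_x_given_y = sum(
    p_y[y] * H([joint[(x, y)] / p_y[y] for x in (0, 1)]) for y in (0, 1)
)

# Chain rule: H(X, Y) = H(Y) + H(X|Y).
assert abs(H(joint.values()) - (H(p_y.values()) + h_x_given_y)) < 1e-9

# Conditioning cannot increase uncertainty: H(X|Y) <= H(X).
assert h_x_given_y <= H(p_x.values()) + 1e-9
```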
Silly crypto revisited

For the silly crypto example, H(M) ≈ 0.81, H(K) = 1.5, and
H(C) ≈ 1.85, so H(K|C) = H(K) + H(M) - H(C) ≈ 0.46.
Exercise: verify this using the definitions.

Spurious keys

A spurious key is a key which is not the correct one but which
produces a "meaningful" message.
Example: shift cipher with c = "WNAJW". Shift 5 gives "river";
shift 22 gives "arena".
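The spurious-key example can be reproduced by brute force; a short Python sketch:

```python
# Brute-force the shift cipher on c = "WNAJW" to exhibit a spurious key.
def shift_decrypt(c, k):
    """Decrypt uppercase shift-cipher ciphertext c with key k."""
    return "".join(chr((ord(ch) - ord("A") - k) % 26 + ord("A")) for ch in c)

assert shift_decrypt("WNAJW", 5) == "RIVER"
assert shift_decrypt("WNAJW", 22) == "ARENA"
# Whichever of the two keys was actually used, the other one is
# spurious: it also produces a meaningful English word.
```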