RI Lecture 1
Introduction
Tim Willems
University of Oxford
2013
$$H(X) = H(p) = -\sum_{x \in \mathcal{X}} p(x)\,\log p(x) = -\mathbb{E}_p\left[\log p(x)\right]$$
A measure of uncertainty/unpredictability/information in X
How difficult is it to describe X?
Entropy is the average length of the shortest description of the random
variable
Base of log does not matter
With log = log2 , H (X ) is expressed in bits
With log = ln, H(X) is expressed in nats, where 1 nat ≈ 1.44 bits (since e = 2^b for b = 1/ln 2 ≈ 1.44)
Tim Willems, Rational Inattention (2013), slide 9/32
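The bits/nats relation above can be checked numerically. A minimal sketch (the distribution `p` and the function name are illustrative, not from the slides):

```python
import math

# Shannon entropy of a discrete distribution, in an arbitrary log base
def entropy(p, base=2.0):
    return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

p = [0.5, 0.25, 0.25]                  # example distribution
h_bits = entropy(p, base=2.0)          # entropy in bits: 1.5
h_nats = entropy(p, base=math.e)       # the same quantity in nats

# Changing the base only rescales: 1 nat = 1/ln 2 ≈ 1.4427 bits
print(h_bits, h_nats / math.log(2))    # both print 1.5
```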
Entropy (ii)
$$H(X) = -\sum_{x \in \mathcal{X}} p(x)\,\log p(x)$$
$$X = \begin{cases} 1 & \text{w.p. } p \\ 0 & \text{w.p. } 1-p \end{cases}$$
Then:
$$H(p) = -p\,\log_2 p - (1-p)\,\log_2(1-p)$$
Note that:
$$H(1/2) = 1 \text{ bit}$$
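The binary entropy formula above is easy to verify numerically (a sketch; the function name is mine):

```python
import math

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))   # 1.0: a fair coin is maximally unpredictable
print(binary_entropy(0.9))   # ≈ 0.47: a biased coin is easier to describe
```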
$$X \sim N(\mu_X, \sigma_X^2)$$
Then:
$$H(X) = \frac{1}{2}\,\log_2\!\left(2\pi e\,\sigma_X^2\right)$$
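The Gaussian differential-entropy formula in code form (a sketch; note that it depends only on the variance, not on the mean):

```python
import math

def gaussian_entropy_bits(sigma2):
    # Differential entropy of N(mu, sigma2): 0.5 * log2(2*pi*e*sigma2)
    return 0.5 * math.log2(2 * math.pi * math.e * sigma2)

print(gaussian_entropy_bits(1.0))   # ≈ 2.05 bits for a standard normal
# Quadrupling the variance (doubling the std dev) adds exactly one bit:
print(gaussian_entropy_bits(4.0) - gaussian_entropy_bits(1.0))
```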
$$I(X;S) = H(X) - H(X \mid S)$$
$$I(X;S) = I(S;X)$$
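Mutual information and its symmetry can be verified on a small discrete example (a sketch with made-up numbers: X is a fair coin observed through a signal S that flips it with probability 0.1):

```python
import math

def H(dist):
    # Shannon entropy (in bits) of a list of probabilities
    return -sum(p * math.log2(p) for p in dist if p > 0)

# Joint distribution of (x, s): X fair, S flips X with probability 0.1
joint = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}

pX = [sum(p for (x, s), p in joint.items() if x == v) for v in (0, 1)]
pS = [sum(p for (x, s), p in joint.items() if s == v) for v in (0, 1)]

# I(X;S) = H(X) + H(S) - H(X,S), equivalent to H(X) - H(X|S);
# this formula is symmetric in X and S, hence I(X;S) = I(S;X)
I = H(pX) + H(pS) - H(list(joint.values()))
print(I)   # ≈ 0.53 bits learned about X from one noisy observation
```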
Lucas island model works from the assumption that agents can only
observe the current state of monetary policy with a delay
But in reality this information lag is short
Lucas island model has difficulties in explaining persistent business cycle fluctuations
Sims (2003): when agents can't attend to all information, there is a difference between available information and information reflected in decisions
Using that $p_{it} = \mathbb{E}_{\Delta,z}\left\{ p^*_{it} \mid S^\Delta_{it}, S^z_{it} \right\}$, this means:
$$\min_{f(\Delta,S^\Delta),\,f(z_i,S^z)} \frac{\gamma}{2}\,\mathbb{E}_{\Delta,z,S^\Delta,S^z}\left[\left( p^*_{it} - \mathbb{E}_{\Delta,z}\left\{ p^*_{it} \mid S^\Delta_{it}, S^z_{it} \right\} \right)^2\right]$$
$$= \min_{f(\Delta,S^\Delta),\,f(z_i,S^z)} \frac{\gamma}{2}\,\mathbb{E}_{S^\Delta,S^z}\left[\mathbb{E}_{\Delta,z}\left[\left( p^*_{it} - \mathbb{E}_{\Delta,z}\left\{ p^*_{it} \mid S^\Delta_{it}, S^z_{it} \right\} \right)^2 \,\Big|\, S^\Delta_{it}, S^z_{it}\right]\right]$$
$$= \min_{f(\Delta,S^\Delta),\,f(z_i,S^z)} \frac{\gamma}{2}\,\mathbb{E}_{S^\Delta,S^z}\left[\mathrm{Var}\left( p^*_{it} \mid S^\Delta_{it}, S^z_{it} \right)\right]$$
$$= \min_{f(\Delta,S^\Delta),\,f(z_i,S^z)} \frac{\gamma}{2}\,\mathrm{Var}\left( p^*_{it} \mid S^\Delta_{it}, S^z_{it} \right)$$
(the outer expectation drops because, with jointly Gaussian fundamentals and signals, the conditional variance does not depend on the signal realization)
subject to the information-flow constraint:
$$\frac{1}{2}\log_2\!\left(\frac{\sigma^2_\Delta}{\sigma^2_{\Delta \mid S^\Delta_{it}}}\right) + \frac{1}{2}\log_2\!\left(\frac{\sigma^2_z}{\sigma^2_{z \mid S^z_{it}}}\right) \le \kappa$$
$$= \underbrace{\frac{1}{2}\log_2\!\left(\frac{\sigma^2_\Delta + \sigma^2_\varepsilon}{\sigma^2_\varepsilon}\right)}_{\kappa_\Delta} + \underbrace{\frac{1}{2}\log_2\!\left(\frac{\sigma^2_z + \sigma^2_\psi}{\sigma^2_\psi}\right)}_{\kappa_z} \le \kappa$$
Lagrangean formulation:
$$\mathcal{L} = \sigma^2_{\Delta \mid S^\Delta_{it}} + \alpha_2^2\,\sigma^2_{z \mid S^z_{it}} + \lambda\left( \frac{1}{2}\log_2\!\left(\frac{\sigma^2_\Delta}{\sigma^2_{\Delta \mid S^\Delta_{it}}}\right) + \frac{1}{2}\log_2\!\left(\frac{\sigma^2_z}{\sigma^2_{z \mid S^z_{it}}}\right) - \kappa \right)$$
Solution (check!):
Or, equivalently:
$$\kappa_\Delta = \frac{1}{2}\kappa + \frac{1}{4}\log_2\!\left(\frac{\sigma^2_\Delta}{\alpha_2^2\,\sigma^2_z}\right)$$
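The "check" can be sketched as follows (a reconstruction from the Lagrangean above, not taken from the slides). The first-order conditions with respect to the two posterior variances are
$$1 - \frac{\lambda}{2\ln 2\;\sigma^2_{\Delta \mid S^\Delta_{it}}} = 0, \qquad \alpha_2^2 - \frac{\lambda}{2\ln 2\;\sigma^2_{z \mid S^z_{it}}} = 0,$$
so $\sigma^2_{\Delta \mid S^\Delta_{it}} = \alpha_2^2\,\sigma^2_{z \mid S^z_{it}}$: attention is allocated until the marginal value of reducing each posterior variance is equalized. Combined with the binding constraint $\kappa_\Delta + \kappa_z = \kappa$, this gives
$$\kappa_\Delta - \kappa_z = \frac{1}{2}\log_2\frac{\sigma^2_\Delta}{\sigma^2_{\Delta \mid S^\Delta_{it}}} - \frac{1}{2}\log_2\frac{\sigma^2_z}{\sigma^2_{z \mid S^z_{it}}} = \frac{1}{2}\log_2\frac{\sigma^2_\Delta}{\alpha_2^2\,\sigma^2_z},$$
and hence $\kappa_\Delta = \frac{1}{2}\kappa + \frac{1}{4}\log_2\!\left(\sigma^2_\Delta / (\alpha_2^2\,\sigma^2_z)\right)$.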
Intuition?