Lecture IV - Entropy and Information Theory
Tahereh Toosi
IPM
Recap
[Figure: stimulus parameter space; the estimate of the stimulus given the spike train occupies an accessible region for all stimuli in P[S]]
Entropy
h(P[r]) = −log₂ P[r] quantifies the surprise or unpredictability associated with a particular response r.
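The surprise of a single response and its average, the entropy, can be sketched in a few lines (the response distribution here is made up for illustration; it is not from the lecture):

```python
import numpy as np

# Hypothetical distribution over four possible responses (illustrative values)
P = np.array([0.5, 0.25, 0.125, 0.125])

# Surprise of each individual response: h = -log2 P[r]
surprise = -np.log2(P)      # [1., 2., 3., 3.] bits

# Entropy is the average surprise over the response distribution
H = np.sum(P * surprise)    # 1.75 bits
print(surprise, H)
```

Rare responses (small P[r]) carry large surprise; the entropy weights each surprise by how often it actually occurs.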
The entropy of a binary code
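For a binary code with symbol probabilities p and 1 − p, the entropy is H = −p log₂ p − (1 − p) log₂ (1 − p); a minimal sketch (the function name is mine):

```python
import math

def binary_entropy(p):
    """Entropy (bits) of a binary code whose '1' symbol has probability p."""
    if p in (0.0, 1.0):
        return 0.0  # a deterministic symbol carries no uncertainty
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))  # 1.0 -- maximal: one bit per symbol
print(binary_entropy(0.1))  # skewed codes carry less than one bit
```

The entropy peaks at p = 0.5 and falls to zero at p = 0 or p = 1, which is why an efficient code uses its symbols with roughly equal frequency.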
Mutual Information
The entropy of the responses to a given stimulus (the noise entropy): H(R | s) = −Σᵣ P[r | s] log₂ P[r | s].
Mutual information is the total response entropy minus the average noise entropy: I(R ; S) = H(R) − Σₛ P[s] H(R | s).
How to use Information Theory:
1. Show your system stimuli.
2. Measure neural responses.
3. Estimate: P( neural response | stimulus presented )
4. From that, estimate: P( neural response )
5. Compute: H(neural response) and
H(neural response | stimulus presented)
6. Calculate: I(response ; stimulus)
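The recipe above can be sketched end to end; the counts matrix here is invented for illustration (rows index stimuli, columns index responses):

```python
import numpy as np

# Hypothetical trial counts: counts[s, r] = number of times
# response r was observed after stimulus s (made-up data).
counts = np.array([[40, 10],
                   [10, 40]])

P_s = counts.sum(axis=1) / counts.sum()                   # P(stimulus)
P_r_given_s = counts / counts.sum(axis=1, keepdims=True)  # P(response | stimulus)
P_r = P_s @ P_r_given_s                                   # P(response)

def H(p):
    """Entropy in bits of a discrete distribution (zero-probability terms dropped)."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

H_R = H(P_r)                                              # total response entropy
H_R_given_S = np.sum(P_s * np.array([H(row) for row in P_r_given_s]))  # noise entropy
I = H_R - H_R_given_S                                     # mutual information (bits)
print(H_R, H_R_given_S, I)
```

With these numbers H(R) = 1 bit, the noise entropy is about 0.72 bits, and the mutual information is about 0.28 bits: the response tells you something, but far from everything, about which stimulus was shown.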
Entropy: spike count

Response (binary word)    Spike count
11001010                  4
01000110                  3
00110111                  5
11000000                  2
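Reducing each response to its spike count and computing the entropy of the resulting distribution can be sketched as follows, using the four rows of the table above:

```python
from collections import Counter
import math

# Spike counts from the table's four responses
spike_counts = [4, 3, 5, 2]

# Empirical distribution over observed spike counts
n = len(spike_counts)
probs = [c / n for c in Counter(spike_counts).values()]

# Each count occurs once, so the distribution is uniform over 4 values:
# H = log2(4) = 2 bits
H = -sum(p * math.log2(p) for p in probs)
print(H)  # 2.0
```

Note that this "spike count" code throws away the timing of the spikes; measuring only the count is exactly the kind of choice of response variable that the next slide warns can go wrong.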
How to screw it up:
• Choose stimuli which are not representative.
• Measure the “wrong” aspect of the response.
• Don’t take enough data to estimate P( ) well.
• Use a crappy method of computing H( ).
• Calculate I( ) and report it without comparing it to anything...