
J Comput Neurosci (2011) 30:1–5

DOI 10.1007/s10827-011-0314-3

Information theory in neuroscience


Alexander G. Dimitrov · Aurel A. Lazar ·
Jonathan D. Victor

Published online: 29 January 2011


© Springer Science+Business Media, LLC 2011

A. G. Dimitrov (✉)
Department of Mathematics and Science Programs,
Washington State University Vancouver,
14204 NE Salmon Creek Ave, Vancouver, WA 98686, USA
e-mail: alex.dimitrov@vancouver.wsu.edu

A. A. Lazar
Department of Electrical Engineering, Columbia University,
Mail Code 4712, 500 West 120th Street, New York, NY 10027, USA
e-mail: aurel@ee.columbia.edu

J. D. Victor
Division of Systems Neurology and Neuroscience,
Department of Neurology and Neuroscience,
Weill Cornell Medical College,
1300 York Avenue, New York, NY 10065, USA
e-mail: jdvicto@med.cornell.edu

Information Theory started and, according to some, ended with Shannon’s seminal paper “A Mathematical Theory of Communication” (Shannon 1948). Because its significance and flexibility were quickly recognized, there were numerous attempts to apply it to diverse fields outside of its original scope. This prompted Shannon to write his famous essay “The Bandwagon” (Shannon 1956), warning against indiscriminate use of the new tool. Nevertheless, non-standard applications of Information Theory persisted.

Very soon after Shannon’s initial publication (Shannon 1948), several manuscripts provided the foundations of much of the current use of information theory in neuroscience. MacKay and McCulloch (1952) applied the concept of information to propose limits on the transmission capacity of a nerve cell. This work foreshadowed future work on what can be termed “Neural Information Flow”: how much information moves through the nervous system, and the constraints that information theory imposes on the capabilities of neural systems for communication, computation and behavior. A second set of manuscripts, by Attneave (1954) and Barlow (1961), discussed information as a constraint on neural system structure and function, proposing that neural structure in sensory systems is matched to the statistical structure of the sensory environment in a way that optimizes information transmission. This is the main idea behind the “Structure from Information” line of research, which is still very active today. A third thread, “Reliable Computation with Noisy/Faulty Elements”, started both in the information-theoretic community (Shannon and McCarthy 1956) and in neuroscience (Winograd and Cowan 1963). With the advent of integrated circuits that were essentially faultless, interest began to wane. However, as IC technology continues to push towards smaller and faster computational elements (even at the expense of reliability), and as neuromorphic systems are developed with variability designed in (Merolla and Boahen 2006), this topic is gaining popularity again in the electronics community, and neuroscientists again may have something to contribute to the discussion.

1 Subsequent developments

The theme that arguably has had the widest influence on the neuroscience community, and is most heavily represented in the current special issue of JCNS, is that of “Neural Information Flow”. The initial works of MacKay and McCulloch (1952), McCulloch (1952) and Rapoport and Horvath (1960) showed that neurons are in principle able to relay large quantities of information. This research led to the first attempts to characterize the information flow in specific neural systems (Werner and Mountcastle 1965), and also started the first major controversy in the field, which still resonates today: the debate about timing versus frequency codes (Stein 1967; Stein et al. 1972). A steady stream of articles followed, both discussing these hypotheses and attempting to clarify the type of information relayed by nerve cells (Abeles and Lass 1975; Eagles and Purple 1974; Eckhorn and Pöpel 1974; Eckhorn et al. 1976; Harvey 1978; Lass and Abeles 1975; Norwich 1977; Poussart 1971; Stark et al. 1969; Taylor 1975; Walloe 1970).

After the initial rise in interest, the application of Information Theory to neuroscience was extended to a few more systems and questions (Eckhorn and Pöpel 1981; Eckhorn and Querfurth 1985; Fuller and Williams 1983; Kjaer et al. 1994; Lestienne and Strehler 1987, 1988; Optican and Richmond 1987; Surmeier and Weinberg 1985; Tsukada et al. 1984; Victor and Johannesma 1986), but did not spread too broadly. This was presumably because, despite strong theoretical advances in Information Theory, its applicability was hampered by the difficulty of measuring and interpreting information-theoretic quantities.

The work of de Ruyter van Steveninck and Bialek (1988) started what could be called the modern era of information-theoretic analysis in neuroscience, in which Information Theory is seeing more and more refined applications. Their work advanced the conceptual aspects of the application of information theory to neuroscience and, subsequently, provided a relatively straightforward way to estimate information-theoretic quantities (Strong et al. 1998). This work provided an approach to removing biases in information estimates due to finite sample size, but the scope of applicability of their approach was limited. The difficulties in obtaining unbiased estimates of information-theoretic quantities were noted early on by Carlton (1969) and Miller (1955), and were brought again to attention by Treves and Panzeri (1995). However, it took the renewed interest generated by Strong et al. (1998) to spur the research that eventually resolved them. Almost simultaneously, several groups provided the neuroscience community with robust estimators valid under different conditions (Kennel et al. 2005; Nemenman et al. 2004; Paninski 2003; Victor 2002). This diversity proved important, as Paninski (2003) proved an inconsistency theorem showing that most common estimation techniques can encounter conditions that lead to arbitrarily poor estimates.
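To make the sampling problem concrete: if a response variable can take K values and is observed N times, the naive “plug-in” (maximum likelihood) entropy estimate is biased downward, to first order (the classical result underlying Miller 1955),

    \mathbb{E}[\hat{H}_{\mathrm{plugin}}] \approx H - \frac{K-1}{2N \ln 2} \quad \text{bits}.

Since K grows exponentially with the length of the spike-train words being analyzed, this bias can easily dominate the quantities of interest, which is the root of the estimation difficulties described above.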

At the same time, several other branches of Information Theory saw application in a neuroscience context. The introduction of techniques stemming from work on quantization and lossy compression (Gersho and Gray 1991) provided lower bounds on information-theoretic quantities, and ideas about inference based on them (Dimitrov and Miller 2001; Samengo 2002; Tishby et al. 1999). Furthermore, a large class of neuron models were characterized as samplers that, under appropriate conditions, faithfully encode sensory information in the spike train (time encoding machines, Lazar 2004). The class of neurons includes integrate-and-fire (Lazar and Pnevmatikakis 2008), threshold-and-fire (Lazar et al. 2010) and Hodgkin-Huxley neurons (Lazar 2010).
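As a sketch of the time-encoding idea (in roughly the notation of Lazar 2004): an ideal integrate-and-fire neuron with bias b, integration constant κ and threshold δ maps a stimulus u(t) into spike times t_k satisfying the “t-transform”

    \int_{t_k}^{t_{k+1}} u(s)\, ds = \kappa \delta - b\,(t_{k+1} - t_k),

so that every interspike interval is a linear measurement of u. For appropriately bandlimited stimuli and sufficiently dense spiking, these measurements determine u exactly, which is the precise sense in which such neurons “faithfully encode” their input.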
2 Current state

The work presented in the current issue builds on the developments in both information-theoretic and experimental techniques.

Several of the contributions apply a recent development in Information Theory, directed information, to clarifying the structure of biological neural networks from observations of their activity. These works originate from the ideas of Granger (1969) on causal interactions, which were placed in an information-theoretic perspective by Massey (1990), Massey and Massey (2005) and Schreiber (2000). Amblard and Michel (2011) here merge the two ideas and extract Granger causality graphs by using directed information measures. They show that such tools are needed to analyze the structure of systems with feedback in general, and neural systems specifically. The authors also provide practical approximations with which to estimate these structures.
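For reference, the directed information of Massey (1990) from a sequence X^N to a sequence Y^N is

    I(X^N \to Y^N) = \sum_{n=1}^{N} I(X^n; Y_n \mid Y^{n-1}).

Unlike the ordinary mutual information I(X^N; Y^N), it is not symmetric in X and Y, which is what makes it suitable for inferring the direction of influence in systems with feedback.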
Quinn et al. (2011) present a novel, robust non-linear extension of the linear Granger tools and use it to infer the dynamics of neural ensembles based on physiological observations. In particular, the procedure uses point process models of neural spike trains, performs parameter and model order selection with minimal description length, and is applied to the analysis of interactions in neuronal assemblies in the primary motor cortex (MI) of macaque monkeys. Vicente et al. (2011) investigate transfer entropy (TE) as an alternative measure of effective connectivity, applying it to electrophysiological data from simulations and from magnetoencephalography (MEG) recordings in a simple motor task. The authors demonstrate that TE improves the detectability of effective connectivity for non-linear interactions, and for sensor-level MEG signals, where linear methods are hampered by signal cross-talk due to volume conduction. Using neocortical column neuronal network simulations, Neymotin et al. (2011) demonstrate that networks with greater internal connectivity reduce input/output correlations from excitatory synapses and decrease negative correlations from inhibitory synapses. These changes were measured by normalized TE. Lizier et al. (2011) take the TE idea further and apply it to functional imaging data to determine the direction of information flow between brain regions. As a proof of principle, they show that this approach enables the identification of a hierarchical (tiered) network of cerebellar and cortical regions involved in motor tasks, and how the functional interactions of these regions are modulated by task difficulty.
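Transfer entropy (Schreiber 2000) measures the information that the past of a process X adds about the next value of a process Y, beyond what Y’s own past already provides. For discretized data (e.g., binned spike trains), a minimal plug-in estimator might look like the following sketch; the function and the lag-one toy example are ours, not taken from any of the papers above.

    import numpy as np
    from collections import Counter

    def transfer_entropy(x, y, k=1):
        """Plug-in estimate of TE_{X->Y} in bits, with history length k:
        TE = sum p(y_n, y_past, x_past)
                 * log2[ p(y_n | y_past, x_past) / p(y_n | y_past) ]."""
        n = len(x)
        joint = Counter()      # counts of (y_n, y_past, x_past)
        full_past = Counter()  # counts of (y_past, x_past)
        self_past = Counter()  # counts of y_past alone
        self_pair = Counter()  # counts of (y_n, y_past)
        for t in range(k, n):
            yp, xp = tuple(y[t-k:t]), tuple(x[t-k:t])
            joint[(y[t], yp, xp)] += 1
            full_past[(yp, xp)] += 1
            self_past[yp] += 1
            self_pair[(y[t], yp)] += 1
        total = n - k
        te = 0.0
        for (yn, yp, xp), c in joint.items():
            p_full = c / full_past[(yp, xp)]              # p(y_n | y_past, x_past)
            p_self = self_pair[(yn, yp)] / self_past[yp]  # p(y_n | y_past)
            te += (c / total) * np.log2(p_full / p_self)
        return te

    # Toy check: y copies x with a one-step delay.
    rng = np.random.default_rng(0)
    x = rng.integers(0, 2, 100_000)
    y = np.roll(x, 1)
    print(transfer_entropy(x, y))  # close to 1 bit
    print(transfer_entropy(y, x))  # close to 0 bits

Such naive plug-in estimates inherit all the small-sample biases discussed above, which is why the contributions in this issue take considerable care with estimation.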
As mentioned above, the “structure from information” theme is represented by a robust stream of investigations, centered on the notion that neural circuitry exploits the statistical features of the environment to enable efficient coding of sensory stimuli (Barlow 1961). The overwhelming majority of studies concern the implications of this principle for processing at a neuronal level (e.g., Atick and Redlich 1990). In this issue, however, Vanni and Rosenström (2011) consider a larger scale of organization. Through the use of functional imaging techniques to assess the distribution of activity levels in neuronal populations in visual cortex, they show that context effects can be interpreted as a means to achieve decorrelation and, hence, efficient coding.

The use of information-theoretic tools to probe the structure of network firing patterns has recently been the subject of great interest, sparked by work from two laboratories (Shlens et al. 2006, 2009; Schneidman et al. 2006; for a review see Nirenberg and Victor 2007) which showed that retinal firing patterns are very well described by a pairwise maximum-entropy model. Not surprisingly (given its complex local circuitry), the pairwise model fails in cortex (Ohiorhenuan et al. 2010); the present contribution from Ohiorhenuan and Victor (2011) shows that the deviations from the pairwise model are highly systematic, and how they can be characterized.
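For a population of n cells with binarized responses r_i ∈ {0, 1}, the pairwise maximum-entropy model takes the Ising form

    P(r) = \frac{1}{Z} \exp\Big( \sum_i h_i r_i + \sum_{i<j} J_{ij}\, r_i r_j \Big),

the least-structured (highest-entropy) distribution consistent with the measured firing rates and pairwise correlations. Systematic deviations from this model are therefore direct evidence of higher-order, multi-neuronal structure.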
Several contributions also engage classic information-theoretic tools from channel and source coding. Kim et al. (2011) report for the first time that the responses of olfactory sensory neurons in fruit flies expressing the OR59b receptor are very precise and reproducible. The response of these neurons depends not only on the odorant concentration, but also on its rate of change. The authors demonstrate that a two-dimensional encoding manifold in a concentration/concentration-gradient space provides a quantitative description of the neuron’s response. By defining a distance measure between spike trains, Gillespie and Houghton (2011) present a novel method for calculating the information channel capacity of spike-train channels. As an example, they calculate the capacity of a data set recorded from auditory neurons in zebra finch.
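The capacity in question is Shannon’s (1948):

    C = \max_{p(x)} I(X; Y),

the mutual information between input and output maximized over input distributions. The difficulty in the neural setting is that the output alphabet is a space of spike trains rather than a finite set of symbols, which is what motivates the metric-space construction.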
Dimitrov et al. (2011) present a twofold extension of their Information Distortion method (Dimitrov and Miller 2001) as applied to the problem of neural coding. On the theoretical side, they develop the idea of joint quantization, which provides optimal lossy compressions of the stimulus and response spaces simultaneously. The practical application to neural coding in the cricket cercal system introduces a family of estimators which provide greater flexibility of the method for different systems and experimental conditions.
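These quantization methods share a common template with the information bottleneck of Tishby et al. (1999): choose a soft assignment p(z|x) of stimuli (or responses) x to a small set of classes z that compresses while preserving information about the variable of interest y, for instance by minimizing

    \min_{p(z|x)} \; I(X; Z) - \beta\, I(Z; Y),

where β sets the compression/fidelity trade-off. The Information Distortion cost of Dimitrov and Miller (2001) is closely related, with an entropy term playing the role of the compression penalty.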
And finally, Lewi et al. (2011) use Information Theory to approach the question “What are the best ways to study a biological system?”, when the goal is to characterize the system as efficiently as possible. Tzanakou et al. (1979) initially posed it in the form “Which is the ‘best’ stimulus for a sensory system?”. More recently, Machens (2002) rephrased the problem as “What is the best stimulus distribution that maximizes the information transmission in a sensory system?”. This paper takes the authors’ general formulation (Paninski 2005), “search for stimuli that maximize the information between system description and system observations,” extends it to continuous stimuli, and applies it to the analysis of the auditory midbrain of zebra finches.
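In the sequential form used by Lewi et al. (2011), the recipe is to choose each next stimulus to be maximally informative about the model parameters θ given the data already collected; schematically (the notation here is ours),

    x_{n+1} = \arg\max_{x} \; I\big(\theta;\, r \,\big|\, x,\ \{(x_i, r_i)\}_{i=1}^{n}\big),

so that experimental trials are concentrated where the current model is least certain.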
In conclusion, Information Theory is thriving in the neuroscience community, and long-standing efforts are bearing fruit, as diverse research questions are being approached with more elaborate and refined tools. As demonstrated by this issue and the complementary compilation of the Information Theory Society (Milenkovic et al. 2010), Information Theory is firmly integrated into the fabric of neuroscience research, and of a progressively wider range of biological research in general, and will continue to play an important role in these disciplines. Conversely, neuroscience is starting to serve as a driver for further research in Information Theory, opening interesting new directions of inquiry.

References

Abeles, M., & Lass, Y. (1975). Transmission of information by the axon: II. The channel capacity. Biological Cybernetics, 19, 121–125.
Amblard, P.-O., & Michel, O. (2011). On directed information theory and Granger causality graphs. Journal of Computational Neuroscience. doi:10.1007/s10827-010-0231-x.
Atick, J. J., & Redlich, A. N. (1990). Towards a theory of early visual processing. Neural Computation, 2, 308–320.
Attneave, F. (1954). Some informational aspects of visual perception. Psychological Review, 61, 183–193.
Barlow, H. B. (1961). Possible principles underlying the transformations of sensory messages. In W. A. Rosenblith (Ed.), Sensory communication. MIT Press, Cambridge, MA.
Carlton, A. G. (1969). On the bias of information estimates. Psychological Bulletin, 71, 108–109.
de Ruyter van Steveninck, R., & Bialek, W. (1988). Real-time performance of a movement-sensitive neuron in the blowfly visual system: Coding and information transfer in short spike sequences. Proceedings of the Royal Society of London, Series B, Biological Sciences, 234, 379–414.
Dimitrov, A. G., Cummins, G. I., Baker, A., & Aldworth, Z. N. (2011). Characterizing the fine structure of a neural sensory code through information distortion. Journal of Computational Neuroscience. doi:10.1007/s10827-010-0261-4.
Dimitrov, A. G., & Miller, J. P. (2001). Neural coding and decoding: Communication channels and quantization. Network: Computation in Neural Systems, 12, 441–472.
Eagles, J. P., & Purple, R. L. (1974). Afferent fibers with multiple encoding sites. Brain Research, 77(2), 187–193.
Eckhorn, R., Grüsser, O. J., Kröller, J., Pellnitz, K., & Pöpel, B. (1976). Efficiency of different neuronal codes: Information transfer calculations for three different neuronal systems. Biological Cybernetics, 22, 49–60.
Eckhorn, R., & Pöpel, B. (1974). Rigorous and extended application of information theory to the afferent visual system of the cat. I. Basic concepts. Biological Cybernetics, 16, 191–200.
Eckhorn, R., & Pöpel, B. (1981). Responses of cat retinal ganglion cells to the random motion of a spot stimulus. Vision Research, 21(4), 435–443.
Eckhorn, R., & Querfurth, H. (1985). Information transmission by isolated frog muscle spindle. Biological Cybernetics, 52, 165–176.
Fuller, M. S., & Williams, W. J. (1983). A continuous information theoretic approach to the analysis of cutaneous receptor neurons. Biological Cybernetics, 47, 13–16.
Gersho, A., & Gray, R. M. (1991). Vector quantization and signal compression. Series in Engineering and Computer Science. Kluwer International.
Gillespie, J. B., & Houghton, C. (2011). A metric space approach to the information channel capacity of spike trains. Journal of Computational Neuroscience. doi:10.1007/s10827-010-0286-8.
Granger, C. (1969). Investigating causal relations by econometric models and cross-spectral methods. Econometrica, 37(3), 424–438.
Harvey, R. (1978). Patterns of output firing generated by a many-input neuronal model for different model parameters and patterns of synaptic drive. Brain Research, 150(2), 259–276.
Kennel, M., Shlens, J., Abarbanel, H., & Chichilnisky, E. J. (2005). Estimating entropy rates with Bayesian confidence intervals. Neural Computation, 17, 1531–1576.
Kim, A. J., Lazar, A. A., & Slutskiy, Y. B. (2011). System identification of Drosophila olfactory sensory neurons. Journal of Computational Neuroscience. doi:10.1007/s10827-010-0265-0.
Kjaer, T. W., Hertz, J. A., & Richmond, B. J. (1994). Decoding cortical neuronal signals: Network models, information estimation and spatial tuning. Journal of Computational Neuroscience, 1(1–2), 109–139.
Lass, Y., & Abeles, M. (1975). Transmission of information by the axon: I. Noise and memory in the myelinated nerve fiber of the frog. Biological Cybernetics, 19, 61–67.
Lazar, A. A. (2004). Time encoding with an integrate-and-fire neuron with a refractory period. Neurocomputing, 58–60, 53–58.
Lazar, A. A. (2010). Population encoding with Hodgkin-Huxley neurons. IEEE Transactions on Information Theory, Special Issue on Molecular Biology and Neuroscience, 56(2), 821–837.
Lazar, A. A., & Pnevmatikakis, E. A. (2008). Faithful representation of stimuli with a population of integrate-and-fire neurons. Neural Computation, 20(11), 2715–2744.
Lazar, A. A., Pnevmatikakis, E. A., & Zhou, Y. (2010). Encoding natural scenes with neural circuits with random thresholds. Vision Research, Special Issue on Mathematical Models of Visual Coding, 50(22), 2200–2212.
Lestienne, R., & Strehler, B. L. (1987). Time structure and stimulus dependence of precisely replicating patterns present in monkey cortical neuronal spike trains. Brain Research, 437(2), 214–238.
Lestienne, R., & Strehler, B. (1988). Differences between monkey visual cortex cells in triplet and ghost doublet informational symbols relationships. Biological Cybernetics, 59, 337–352.
Lewi, J., Schneider, D., Woolley, S., & Paninski, L. (2011). Automating the design of informative sequences of sensory stimuli. Journal of Computational Neuroscience. doi:10.1007/s10827-010-0248-1.
Lizier, J., Heinzle, J., Horstmann, A., Haynes, J.-D., & Prokopenko, M. (2011). Multivariate information-theoretic measures reveal directed information structure and task relevant changes in fMRI connectivity. Journal of Computational Neuroscience. doi:10.1007/s10827-010-0271-2.
Machens, C. K. (2002). Adaptive sampling by information maximization. Physical Review Letters, 88(22), 228104.
MacKay, D. M., & McCulloch, W. S. (1952). The limiting information capacity of a neuronal link. Bulletin of Mathematical Biophysics, 14, 127–135.
Massey, J. (1990). Causality, feedback and directed information. In Proc. Intl. Symp. Information Theory and Its Applications (ISITA-90) (pp. 303–305).
Massey, J., & Massey, P. (2005). Conservation of mutual and directed information. In Proc. Intl. Symp. Information Theory (ISIT 2005) (pp. 157–158).
McCulloch, W. S. (1952). An upper bound on the informational capacity of a synapse. In Proceedings of the 1952 ACM national meeting, Pittsburgh, Pennsylvania.
Merolla, P. A., & Boahen, K. (2006). Dynamic computation in a recurrent network of heterogeneous silicon neurons. In IEEE International Symposium on Circuits and Systems (pp. 4539–4542). IEEE Press.
Milenkovic, O., Alterovitz, G., Battail, G., Coleman, T. P., Hagenauer, J., Meyn, S. P., et al. (2010). Introduction to the special issue on information theory in molecular biology and neuroscience. IEEE Transactions on Information Theory, 56(2), 649–652.
Miller, G. A. (1955). Note on the bias of information estimates. In Information theory in psychology: Problems and methods (Vol. II-B, pp. 95–100). Free Press.
Nemenman, I., Bialek, W., & de Ruyter van Steveninck, R. (2004). Entropy and information in neural spike trains: Progress on the sampling problem. Physical Review E, 69, 056111.
Neymotin, S. A., Jacobs, K. M., Fenton, A. A., & Lytton, W. W. (2011). Synaptic information transfer in computer models of neocortical columns. Journal of Computational Neuroscience. doi:10.1007/s10827-010-0253-4.
Nirenberg, S., & Victor, J. D. (2007). Analyzing the activity of large populations of neurons: How tractable is the problem? Current Opinion in Neurobiology, 17, 397–400.
Norwich, K. (1977). On the information received by sensory receptors. Bulletin of Mathematical Biology, 39, 453–461.
Ohiorhenuan, I. E., Mechler, F., Purpura, K. P., Schmid, A. M., & Victor, J. D. (2010). Sparse coding and high-order correlations in fine-scale cortical networks. Nature, 466, 617–621.
Ohiorhenuan, I. E., & Victor, J. D. (2011). Information geometric measure of 3-neuron firing patterns characterizes scale-dependence in cortical networks. Journal of Computational Neuroscience. doi:10.1007/s10827-010-0257-0.
Optican, L. M., & Richmond, B. J. (1987). Temporal encoding of two-dimensional patterns by single units in primate inferior temporal cortex. III. Information theoretic analysis. Journal of Neurophysiology, 57, 163–178.
Paninski, L. (2003). Estimation of entropy and mutual information. Neural Computation, 15, 1191–1253.
Paninski, L. (2005). Asymptotic theory of information-theoretic experimental design. Neural Computation, 17, 1480–1507.
Poussart, D. J. M. (1971). Membrane current noise in lobster axon under voltage clamp. Biophysical Journal, 11(2), 211–234.
Quinn, C. J., Coleman, T. P., Kiyavash, N., & Hatsopoulos, N. G. (2011). Estimating the directed information to infer causal relationships in ensemble neural spike train recordings. Journal of Computational Neuroscience. doi:10.1007/s10827-010-0247-2.
Rapoport, A., & Horvath, W. J. (1960). The theoretical channel capacity of a single neuron as determined by various coding systems. Information and Control, 3(4), 335–350.
Samengo, I. (2002). Information loss in an optimal maximum likelihood decoding. Neural Computation, 14, 771–779.
Schneidman, E., Berry II, M. J., Segev, R., & Bialek, W. (2006). Weak pairwise correlations imply strongly correlated network states in a neural population. Nature, 440, 1007–1012.
Schreiber, T. (2000). Measuring information transfer. Physical Review Letters, 85(2), 461–464.
Shannon, C. E. (1948). A mathematical theory of communication. Bell System Technical Journal, 27, 379–423, 623–656.
Shannon, C. E. (1956). The bandwagon. IRE Transactions on Information Theory, 2(1), 3.
Shannon, C. E., & McCarthy, J. (Eds.) (1956). Automata studies. Annals of Mathematics Studies (Vol. 34). Princeton University Press.
Shlens, J., Field, G. D., Gauthier, J. L., Greschner, M., Sher, A., Litke, A. M., et al. (2009). The structure of large-scale synchronized firing in primate retina. Journal of Neuroscience, 29, 5022–5031.
Shlens, J., Field, G. D., Gauthier, J. L., Grivich, M. I., Petrusca, D., Sher, A., et al. (2006). The structure of multi-neuron firing patterns in primate retina. Journal of Neuroscience, 26, 8254–8266.
Stark, L., Negrete-Martinez, J., Yankelevich, G., & Theodoridis, G. (1969). Experiments on information coding in nerve impulse trains. Mathematical Biosciences, 4(3–4), 451–485.
Stein, R. B. (1967). The information capacity of nerve cells using a frequency code. Biophysical Journal, 7(6), 797–826.
Stein, R. B., French, A. S., & Holden, A. V. (1972). The frequency response, coherence, and information capacity of two neuronal models. Biophysical Journal, 12(3), 295–322.
Strong, S. P., Koberle, R., de Ruyter van Steveninck, R. R., & Bialek, W. (1998). Entropy and information in neural spike trains. Physical Review Letters, 80(1), 197–200.
Surmeier, D. J., & Weinberg, R. J. (1985). The relationship between cross-correlation measures and underlying synaptic events. Brain Research, 331(1), 180–184.
Taylor, R. C. (1975). Integration in the crayfish antennal neuropile: Topographic representation and multiple-channel coding of mechanoreceptive submodalities. Developmental Neurobiology, 6(5), 475–499.
Tishby, N., Pereira, F., & Bialek, W. (1999). The information bottleneck method. In Proceedings of the 37th annual Allerton conference on communication, control and computing. University of Illinois.
Treves, A., & Panzeri, S. (1995). The upward bias in measures of information derived from limited data samples. Neural Computation, 7, 399–440.
Tsukada, M., Aihara, T., & Hauske, G. (1984). Redundancy reducing processes in single neurons. Biological Cybernetics, 50, 157–165.
Tzanakou, E., Michalak, R., & Harth, E. (1979). The alopex process: Visual receptive fields by response feedback. Biological Cybernetics, 35, 161–174.
Vanni, S., & Rosenström, T. (2011). Local non-linear interactions in the visual cortex may reflect global decorrelation. Journal of Computational Neuroscience. doi:10.1007/s10827-010-0239-2.
Vicente, R., Wibral, M., Lindner, M., & Pipa, G. (2011). Transfer entropy: A model-free measure of effective connectivity for the neurosciences. Journal of Computational Neuroscience. doi:10.1007/s10827-010-0262-3.
Victor, J. D. (2002). Binless strategies for estimation of information from neural data. Physical Review E, 66, 051903.
Victor, J. D., & Johannesma, P. M. (1986). Maximum-entropy approximations of stochastic nonlinear transductions: An extension of the Wiener theory. Biological Cybernetics, 54, 289–300.
Walloe, L. (1970). On the transmission of information through sensory neurons. Biophysical Journal, 10(8), 745–763.
Werner, G., & Mountcastle, V. B. (1965). Neural activity in mechanoreceptive cutaneous afferents: Stimulus-response relations, Weber functions, and information transmission. Journal of Neurophysiology, 28, 359–397.
Winograd, S., & Cowan, J. D. (1963). Reliable computation in the presence of noise. The MIT Press.
