Information Theory started and, according to some, ended with Shannon’s seminal paper “A Mathematical Theory of Communication” (Shannon 1948). Because its significance and flexibility were quickly recognized, there were numerous attempts to apply it to diverse fields outside of its original scope. This prompted Shannon to write his famous essay “The Bandwagon” (Shannon 1956), warning against indiscriminate use of the new tool. Nevertheless, non-standard applications of Information Theory persisted.
Very soon after Shannon’s initial publication (Shannon 1948), several manuscripts provided the foundations of much of the current use of information theory in neuroscience. MacKay and McCulloch (1952) applied the concept of information to propose limits on the transmission capacity of a nerve cell. This work foreshadowed future work on what can be termed “Neural Information Flow”: how much information moves through the nervous system, and the constraints that information theory imposes on the capabilities of neural systems for communication, computation and behavior. A second set of manuscripts, by Attneave (1954) and Barlow (1961), discussed information as a constraint on neural system structure and function, proposing that neural structure in sensory systems is matched to the statistical structure of the sensory environment so as to optimize information transmission. This is the main idea behind the “Structure from Information” line of research that is still very active today. A third thread, “Reliable Computation with Noisy/Faulty Elements”, started both in the information-theoretic community (Shannon and McCarthy 1956) and in neuroscience (Winograd and Cowan 1963). With the advent of integrated circuits that were essentially faultless, interest began to wane. However, as IC technology continues to push towards smaller and faster computational elements (even at the expense of reliability), and as neuromorphic systems are developed with variability designed in (Merolla and Boahen 2006), this topic is regaining popularity in the electronics community, and neuroscientists may again have something to contribute to the discussion.
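For reference, the quantities at the center of this research program are Shannon’s mutual information between a stimulus S and a response R, and the channel capacity obtained by maximizing it over stimulus distributions; these are the standard definitions from Shannon (1948):

```latex
I(S;R) \;=\; \sum_{s,r} p(s,r)\,\log_2 \frac{p(s,r)}{p(s)\,p(r)},
\qquad
C \;=\; \max_{p(s)} I(S;R) \quad \text{(bits per symbol)}.
```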
1 Subsequent developments
The theme that arguably has had the widest influence on the neuroscience community, and is most heavily represented in the current special issue of JCNS, is that of “Neural Information Flow”. The initial works of MacKay and McCulloch (1952), McCulloch (1952) and Rapoport and Horvath (1960) showed that neurons are in principle able to relay large quantities of information. This research led to the first attempts to characterize the information flow in specific neural systems (Werner and Mountcastle 1965), and also started the first major controversy in the field, which still resonates today: the debate about timing versus frequency codes (Stein 1967; Stein et al. 1972). A steady stream of articles followed, both discussing these hypotheses and attempting to clarify the type of information relayed by nerve cells (Abeles and Lass 1975; Eagles and Purple 1974; Eckhorn and Pöpel 1974; Eckhorn et al. 1976; Harvey 1978; Lass and Abeles 1975; Norwich 1977; Poussart 1971; Stark et al. 1969; Taylor 1975; Walloe 1970).
After the initial rise in interest, the application of Information Theory to neuroscience was extended to a few more systems and questions (Eckhorn and Pöpel 1981; Eckhorn and Querfurth 1985; Fuller and Williams 1983; Kjaer et al. 1994; Lestienne and Strehler 1987, 1988; Optican and Richmond 1987; Surmeier and Weinberg 1985; Tsukada et al. 1984; Victor and Johannesma 1986), but did not spread broadly. This was presumably because, despite strong theoretical advances in Information Theory, its applicability was hampered by the difficulty of measuring and interpreting information-theoretic quantities.
The work of de Ruyter van Steveninck and Bialek (1988) started what could be called the modern era of information-theoretic analysis in neuroscience, in which Information Theory is seeing ever more refined applications. Their work advanced the conceptual aspects of the application of information theory to neuroscience and, subsequently, provided a relatively straightforward way to estimate information-theoretic quantities (Strong et al. 1998). This work provided an approach to removing biases in information estimates due to finite sample size, but the scope of applicability of that approach was limited. The difficulties in obtaining unbiased estimates of information-theoretic quantities were noted early on by Carlton (1969) and Miller (1955) and brought again to attention by Treves and Panzeri (1995). However, it took the renewed interest generated by Strong et al. (1998) to spur the research that eventually resolved them. Almost simultaneously, several groups provided the neuroscience community with robust estimators valid under different conditions (Kennel et al. 2005; Nemenman et al. 2004; Paninski 2003; Victor 2002). This diversity proved important, as Paninski (2003) proved an inconsistency theorem showing that most common estimation techniques can encounter conditions that lead to arbitrarily poor estimates.
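The core of the sampling problem these estimators address can be seen in a few lines. The naive “plug-in” estimator is biased downward when the number of samples is comparable to the number of response categories, and Miller’s (1955) classical correction removes the leading-order term of that bias. The Python sketch below is a toy illustration of the effect, not a reimplementation of any of the cited estimators:

```python
import numpy as np

def plugin_entropy(samples):
    """Naive 'plug-in' entropy estimate (bits) from discrete samples."""
    _, counts = np.unique(samples, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def miller_madow_entropy(samples):
    """Plug-in estimate plus the Miller (1955) first-order correction:
    the naive estimator is low by about (K - 1) / (2 N ln 2) bits,
    with K occupied bins and N samples."""
    _, counts = np.unique(samples, return_counts=True)
    n, k = counts.sum(), len(counts)
    return plugin_entropy(samples) + (k - 1) / (2 * n * np.log(2))

rng = np.random.default_rng(0)
for n in (50, 500, 5000):               # bias shrinks as samples accumulate
    s = rng.integers(0, 16, size=n)     # uniform over 16 'words': true H = 4 bits
    print(n, round(plugin_entropy(s), 3), round(miller_madow_entropy(s), 3))
```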
At the same time, several other branches of Information Theory saw application in a neuroscience context. The introduction of techniques stemming from work on quantization and lossy compression (Gersho and Gray 1991) provided lower bounds on information-theoretic quantities and ideas about inference based on them (Dimitrov and Miller 2001; Samengo 2002; Tishby et al. 1999). Furthermore, a large class of neuron models was characterized as samplers that, under appropriate conditions, faithfully encode sensory information in the spike train (time encoding machines, Lazar 2004). This class includes integrate-and-fire (Lazar and Pnevmatikakis 2008), threshold-and-fire (Lazar et al. 2010) and Hodgkin-Huxley neurons (Lazar 2010).
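As a minimal sketch of the time-encoding idea, the following Python fragment implements an ideal integrate-and-fire encoder: the membrane integrates the biased stimulus and emits a spike time at each threshold crossing, so each interspike interval imposes one constraint (the “t-transform”) on the stimulus. The parameter names and values are illustrative, in the spirit of Lazar (2004) rather than taken from it:

```python
import numpy as np

def iaf_encode(u, dt, bias=1.0, kappa=1.0, delta=0.02):
    """Ideal integrate-and-fire time encoding: integrate (u(t) + bias)/kappa
    and record a spike time whenever the integral crosses delta, then reset."""
    spikes, y = [], 0.0
    for i, ui in enumerate(u):
        y += dt * (ui + bias) / kappa
        if y >= delta:
            spikes.append(i * dt)
            y -= delta              # reset by subtraction keeps the residual
    return np.array(spikes)

t = np.arange(0.0, 1.0, 1e-4)
u = 0.3 * np.sin(2 * np.pi * 5 * t)   # toy bandlimited stimulus, |u| < bias
tk = iaf_encode(u, dt=1e-4)
# Between consecutive spikes, the integral of (u + bias)/kappa equals delta,
# so the spike train faithfully (invertibly) samples u when spikes are dense.
print(len(tk), "spikes; mean rate approx.", len(tk) / t[-1], "Hz")
```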
2 Current state
The work presented in the current issue builds on these developments in both information-theoretic and experimental techniques.
Several of the contributions apply a recent development in Information Theory, directed information, to clarifying the structure of biological neural networks from observations of their activity. These works originate in the ideas of Granger (1969) on causal interactions, which were placed in an information-theoretic perspective by Massey (1990), Massey and Massey (2005) and Schreiber (2000). Amblard and Michel (2011) here merge the two ideas and extract Granger causality graphs by using directed information measures. They show that such tools are needed to analyze the structure of systems with feedback in general, and neural systems specifically. The authors also provide practical approximations with which to estimate these structures. Quinn et al. (2011) present a novel, robust nonlinear extension of the linear Granger tools and use it to infer the dynamics of neural ensembles from physiological observations. In particular, the procedure uses point process models of neural spike trains, performs parameter and model order selection with minimal description length, and is applied to the analysis of interactions in neuronal assemblies in the primary motor cortex (MI) of macaque monkeys. Vicente et al. (2011) investigate transfer entropy (TE) as an alternative measure of effective connectivity, applying it to electrophysiological data from simulations and from magnetoencephalography (MEG) recordings in a simple motor task. The authors demonstrate that TE improves the detectability of effective connectivity for non-linear interactions, and for sensor-level MEG signals where linear methods are hampered by signal cross-talk due to volume conduction. Using neocortical column neuronal network simulations, Neymotin et al. (2011) demonstrate that networks with greater internal connectivity reduce input/output correlations from excitatory synapses and decrease negative correlations from inhibitory synapses. These changes were measured by normalized TE. Lizier et al. (2011) take the TE idea further and apply it to functional imaging data to determine the direction of information flow between brain regions. As a proof of principle, they show that this approach enables the identification of a hierarchical (tiered) network of cerebellar and cortical regions involved in motor tasks, and how the functional interactions of these regions are modulated by task difficulty.
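For concreteness, transfer entropy for two binary sequences with unit history length reduces to TE(X→Y) = Σ p(y_{t+1}, y_t, x_t) log₂ [p(y_{t+1}|y_t, x_t) / p(y_{t+1}|y_t)] (Schreiber 2000). The sketch below is a plain plug-in estimator on toy data; the contributions in this issue use substantially more careful estimation and model selection:

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Transfer entropy TE(X -> Y) in bits for binary sequences with
    history length 1: TE = sum p(y+, y, x) log2 [p(y+|y, x) / p(y+|y)]."""
    n = len(y) - 1
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))
    pairs_yx = Counter(zip(y[:-1], x[:-1]))
    pairs_yy = Counter(zip(y[1:], y[:-1]))
    singles_y = Counter(y[:-1])
    te = 0.0
    for (yp, yo, xo), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(yo, xo)]            # p(y+ | y, x)
        p_cond_self = pairs_yy[(yp, yo)] / singles_y[yo]  # p(y+ | y)
        te += p_joint * np.log2(p_cond_full / p_cond_self)
    return te

rng = np.random.default_rng(1)
x = rng.integers(0, 2, 10000)
y = np.roll(x, 1)                 # y copies x with a one-step delay
y[0] = 0
print(round(transfer_entropy(x, y), 3))   # ~1 bit: x fully drives y
print(round(transfer_entropy(y, x), 3))   # ~0 bits in the reverse direction
```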
As mentioned above, the “structure from information” theme is represented by a robust stream of investigations, centered on the notion that neural circuitry exploits the statistical features of the environment to enable efficient coding of sensory stimuli (Barlow 1961). The overwhelming majority of studies concern the implications of this principle for processing at a neuronal level (e.g., Atick and Redlich 1990). In this issue, however, Vanni and Rosenström (2011) consider a larger scale of organization. Through the use of functional imaging techniques to assess the distribution of activity levels in neuronal populations in visual cortex, they show that context effects can be interpreted as a means to achieve decorrelation, and hence, efficient coding.
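The computational core of the decorrelation argument can be stated in a few lines: given redundant (correlated) input channels, a linear whitening transform removes the pairwise redundancy. The Python sketch below uses toy Gaussian inputs and symmetric (ZCA) whitening purely as an illustration of the principle, not as a model of the cortical computation studied by Vanni and Rosenström (2011):

```python
import numpy as np

rng = np.random.default_rng(2)
# Toy 'sensory inputs': two channels with strongly correlated signals,
# standing in for nearby receptors viewing overlapping parts of a scene.
z = rng.normal(size=(2, 5000))
mix = np.array([[1.0, 0.9], [0.9, 1.0]])
x = mix @ z

cov = np.cov(x)
evals, evecs = np.linalg.eigh(cov)
whiten = evecs @ np.diag(evals ** -0.5) @ evecs.T   # symmetric (ZCA) whitening
y = whiten @ x

print(np.round(np.corrcoef(x), 2))   # off-diagonal ~0.9: redundant code
print(np.round(np.corrcoef(y), 2))   # off-diagonal ~0: decorrelated code
```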
The use of information-theoretic tools to probe the structure of network firing patterns has recently been the subject of great interest, sparked by work from two laboratories (Shlens et al. 2006, 2009; Schneidman et al. 2006; for a review see Nirenberg and Victor 2007) which showed that retinal firing patterns are very well described by a pairwise maximum-entropy model. Not surprisingly (given its complex local circuitry), the pairwise model fails in cortex (Ohiorhenuan et al. 2010); the present contribution from Ohiorhenuan and Victor (2011) shows that the deviations from the pairwise model are highly systematic, and how they can be characterized.
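The pairwise maximum-entropy model in question assigns p(s) ∝ exp(h·s + Σ_{i<j} J_ij s_i s_j) to each binary firing pattern s. For three neurons it can be fit exactly by moment matching, which makes the issue concrete: the fit reproduces all means and pairwise correlations yet can still misestimate the probability of triplet events, the kind of systematic higher-order deviation at stake here. The data below are hypothetical and the code is a sketch, not the method of any of the cited papers:

```python
import numpy as np
from itertools import product

# All 2^3 firing patterns of three binary neurons, enumerable exactly.
states = np.array(list(product([0, 1], repeat=3)), dtype=float)

def model_probs(h, J):
    """Pairwise maximum-entropy model: p(s) ~ exp(h.s + sum_{i<j} J_ij s_i s_j)."""
    logp = states @ h + np.einsum('ki,ij,kj->k', states, np.triu(J, 1), states)
    p = np.exp(logp - logp.max())
    return p / p.sum()

def fit_pairwise(target_mean, target_corr, steps=20000, lr=0.2):
    """Match first and second moments by gradient ascent on the log-likelihood."""
    h, J = np.zeros(3), np.zeros((3, 3))
    for _ in range(steps):
        p = model_probs(h, J)
        h += lr * (target_mean - p @ states)
        J += lr * np.triu(target_corr - states.T @ (states * p[:, None]), 1)
    return h, J

# Hypothetical data: mostly silence, with an excess of triple-firing events.
p_true = np.full(8, 0.05)
p_true[0], p_true[7] = 0.5, 0.2            # p(0,0,0) and p(1,1,1)
mean = p_true @ states
corr = states.T @ (states * p_true[:, None])
h, J = fit_pairwise(mean, corr)
p_fit = model_probs(h, J)
# Means and pairwise correlations now match, but the triplet probability
# deviates: third-order structure is exactly what the pairwise model misses.
print("pairwise fit p(1,1,1):", round(p_fit[7], 3), " true:", p_true[7])
```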
Several contributions also engage classic information-theoretic tools from channel and source coding. Kim et al. (2011) report for the first time that the responses of olfactory sensory neurons in fruit flies expressing the OR59b receptor are very precise and reproducible. The response of these neurons depends not only on the odorant concentration, but also on its rate of change. The authors demonstrate that a two-dimensional encoding manifold in a concentration/concentration-gradient space provides a quantitative description of the neuron’s response. By defining a distance measure between spike trains, Gillespie and Houghton (2011) present a novel method for calculating the information channel capacity of spike-train channels. As an example, they calculate the capacity of a data set recorded from auditory neurons in the zebra finch. Dimitrov et al. (2011) present a twofold extension of their Information Distortion method (Dimitrov and Miller 2001) as applied to the problem of neural coding. On the theoretical side, they develop the idea of joint quantization, which provides optimal lossy compressions of the stimulus and response spaces simultaneously. The practical application to neural coding in the cricket cercal system introduces a family of estimators which provide greater flexibility of the method for different systems and experimental conditions.
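The metric-space approach builds on a distance between spike trains; a canonical choice is the Victor-Purpura edit distance (see Victor 2002 and references therein), in which deleting or inserting a spike costs 1 and moving a spike by Δt costs q|Δt|. Below is a compact dynamic-programming implementation of that metric as an illustration (Gillespie and Houghton’s capacity calculation itself is more involved and not reproduced here):

```python
import numpy as np

def vp_distance(s1, s2, q=1.0):
    """Victor-Purpura spike-train distance: minimal cost of editing one
    spike train into the other, with cost 1 per insertion/deletion and
    q*|dt| per spike shift, computed by the standard edit-distance DP."""
    n, m = len(s1), len(s2)
    d = np.zeros((n + 1, m + 1))
    d[:, 0] = np.arange(n + 1)      # delete all spikes of s1
    d[0, :] = np.arange(m + 1)      # insert all spikes of s2
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d[i, j] = min(d[i - 1, j] + 1,
                          d[i, j - 1] + 1,
                          d[i - 1, j - 1] + q * abs(s1[i - 1] - s2[j - 1]))
    return d[n, m]

a = [0.10, 0.25, 0.40]   # spike times (s)
b = [0.12, 0.40, 0.75]
# q sets the temporal precision the metric cares about: q -> 0 counts spikes
# only; large q treats even small timing offsets as distinct symbols.
for q in (0.0, 1.0, 100.0):
    print(q, round(vp_distance(a, b, q), 3))
```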
Finally, Lewi et al. (2011) use Information Theory to approach the question “What are the best ways to study a biological system?”, when the goal is to characterize the system as efficiently as possible. Tzanakou et al. (1979) initially posed it in the form “Which is the ‘best’ stimulus for a sensory system?”. More recently, Machens (2002) rephrased the problem as “What is the stimulus distribution that maximizes information transmission in a sensory system?”. This paper takes the authors’ general formulation (Paninski 2005), “search for stimuli that maximize the information between system description and system observations,” extends it to continuous stimuli, and applies it to the analysis of the auditory midbrain of zebra finches.
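A toy version of this closed-loop program fits in a few lines: maintain a posterior over an unknown model parameter and, on each trial, present the stimulus that maximizes the mutual information between the parameter and the upcoming response (the expected information gain). Everything in the sketch below, including the sigmoidal spiking model and the grid posterior, is a hypothetical illustration rather than the cited authors’ algorithm:

```python
import numpy as np

rng = np.random.default_rng(3)

def h2(p):
    """Binary entropy in bits, safe at p = 0 or 1."""
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def spike_prob(x, theta):
    """Toy neuron: spike probability is a sigmoid of stimulus minus threshold."""
    return 1.0 / (1.0 + np.exp(-(x - theta)))

theta_grid = np.linspace(-3, 3, 301)     # hypotheses about the threshold
posterior = np.full_like(theta_grid, 1 / len(theta_grid))
candidates = np.linspace(-3, 3, 61)      # stimuli we may present
theta_true = 1.3                         # hidden parameter to discover

for trial in range(30):
    # Expected information gain I(response; theta | x) for each stimulus:
    # entropy of the predictive response minus expected conditional entropy.
    p_cond = spike_prob(candidates[:, None], theta_grid[None, :])
    p_pred = p_cond @ posterior
    gain = h2(p_pred) - h2(p_cond) @ posterior
    x = candidates[np.argmax(gain)]
    r = rng.random() < spike_prob(x, theta_true)     # simulated response
    like = spike_prob(x, theta_grid) if r else 1 - spike_prob(x, theta_grid)
    posterior = posterior * like
    posterior /= posterior.sum()

print("posterior mean threshold:", round(theta_grid @ posterior, 2), "(true 1.3)")
```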
In conclusion, Information Theory is thriving in the neuroscience community, and the long efforts are bearing fruit, as diverse research questions are being approached with ever more elaborate and refined tools. As demonstrated by this issue and the complementary compilation of the Information Theory Society (Milenkovic et al. 2010), Information Theory is firmly integrated into the fabric of neuroscience research, and of a progressively wider range of biological research in general, and will continue to play an important role in these disciplines. Conversely, neuroscience is starting to serve as a driver for further research in Information Theory, opening interesting new directions of inquiry.
Contributor Information
Alexander G. Dimitrov, Email: alex.dimitrov@vancouver.wsu.edu, Department of Mathematics and Science Programs, Washington State University Vancouver, 14204 NE Salmon Creek Ave, Vancouver, WA 98686, USA
Aurel A. Lazar, Email: aurel@ee.columbia.edu, Department of Electrical Engineering, Columbia University, Mail Code 4712, 500 West 120th Street, New York, NY 10027, USA
Jonathan D. Victor, Email: jdvicto@med.cornell.edu, Division of Systems Neurology and Neuroscience, Department of Neurology and Neuroscience, Weill Cornell Medical College, 1300 York Avenue, New York, NY 10065, USA
References
- Abeles M, Lass Y. Transmission of information by the axon: II. The channel capacity. Biological Cybernetics. 1975;19:121–125. doi: 10.1007/BF00337250. [DOI] [PubMed] [Google Scholar]
- Amblard P-O, Michel O. On directed information theory and Granger causality graphs. Journal of Computational Neuroscience. 2011 doi: 10.1007/s10827-010-0231-x. [DOI] [PubMed] [Google Scholar]
- Atick JJ, Redlich AN. Towards a theory of early visual processing. Neural Computation. 1990;2:308–320. [Google Scholar]
- Attneave F. Some information aspects of visual perception. Psychological Review. 1954;61:183–193. doi: 10.1037/h0054663. [DOI] [PubMed] [Google Scholar]
- Barlow HB. Possible principles underlying the transformation of sensory messages. In: Rosenblith WA, editor. Sensory communications. MIT Press; Cambridge MA: 1961. [Google Scholar]
- Carlton AG. On the bias of information estimates. Psychological Bulletin. 1969;71:108–109. [Google Scholar]
- de Ruyter van Steveninck R, Bialek W. Real-time performance of a movement-sensitive neuron in the blowfly visual system: Coding and information transfer in short spike sequences. Proceedings of the Royal Society Series B, Biological Sciences. 1988;234:379–414. [Google Scholar]
- Dimitrov AG, Cummins GI, Baker A, Aldworth ZN. Characterizing the fine structure of a neural sensory code through information distortion. Journal of Computational Neuroscience. 2011 doi: 10.1007/s10827-010-0261-4. [DOI] [PubMed] [Google Scholar]
- Dimitrov AG, Miller JP. Neural coding and decoding: Communication channels and quantization. Network: Computation in Neural Systems. 2001;12:441–472. [PubMed] [Google Scholar]
- Eagles JP, Purple RL. Afferent fibers with multiple encoding sites. Brain Research. 1974;77(2):187–193. doi: 10.1016/0006-8993(74)90783-5. [DOI] [PubMed] [Google Scholar]
- Eckhorn R, Grüsser OJ, Kröller J, Pellnitz K, Pöpel B. Efficiency of different neuronal codes: Information transfer calculations for three different neuronal systems. Biological Cybernetics. 1976;22:49–60. doi: 10.1007/BF00340232. [DOI] [PubMed] [Google Scholar]
- Eckhorn R, Pöpel B. Rigorous and extended application of information theory to the afferent visual system of the cat. I. Basic concepts. Biological Cybernetics. 1974;16:191–200. doi: 10.1007/BF00288979. [DOI] [PubMed] [Google Scholar]
- Eckhorn R, Pöpel B. Responses of cat retinal ganglion cells to the random motion of a spot stimulus. Vision Research. 1981;21(4):435–443. doi: 10.1016/0042-6989(81)90090-0. [DOI] [PubMed] [Google Scholar]
- Eckhorn R, Querfurth H. Information transmission by isolated frog muscle spindle. Biological Cybernetics. 1985;52:165–176. doi: 10.1007/BF00339945. [DOI] [PubMed] [Google Scholar]
- Fuller MS, Williams WJ. A continuous information theoretic approach to the analysis of cutaneous receptor neurons. Biological Cybernetics. 1983;47:13–16. doi: 10.1007/BF00340064. [DOI] [PubMed] [Google Scholar]
- Gersho A, Gray RM. Vector quantization and signal compression. Kluwer International Series in Engineering and Computer Science. Kluwer Academic Publishers; 1991. [Google Scholar]
- Gillespie JB, Houghton C. A metric space approach to the information channel capacity of spike trains. Journal of Computational Neuroscience. 2011 doi: 10.1007/s10827-010-0286-8. [DOI] [PubMed] [Google Scholar]
- Granger C. Investigating causal relations by econometric models and cross-spectral methods. Econometrica. 1969;37(3):424–438. [Google Scholar]
- Harvey R. Patterns of output firing generated by a many-input neuronal model for different model parameters and patterns of synaptic drive. Brain Research. 1978;150(2):259–276. doi: 10.1016/0006-8993(78)90279-2. [DOI] [PubMed] [Google Scholar]
- Kennel M, Shlens J, Abarbanel H, Chichilnisky EJ. Estimating entropy rates with Bayesian confidence intervals. Neural Computation. 2005;17:1531–1576. doi: 10.1162/0899766053723050. [DOI] [PubMed] [Google Scholar]
- Kim AJ, Lazar AA, Slutskiy YB. System identification of Drosophila olfactory sensory neurons. Journal of Computational Neuroscience. 2011 doi: 10.1007/s10827-010-0265-0. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Kjaer TW, Hertz JA, Richmond BJ. Decoding cortical neuronal signals: Network models, information estimation and spatial tuning. Journal of Computational Neuroscience. 1994;1(1–2):109–139. doi: 10.1007/BF00962721. [DOI] [PubMed] [Google Scholar]
- Lass Y, Abeles M. Transmission of information by the axon: I. Noise and memory in the myelinated nerve fiber of the frog. Biological Cybernetics. 1975;19:61–67. doi: 10.1007/BF00364102. [DOI] [PubMed] [Google Scholar]
- Lazar AA. Time encoding with an integrate-and-fire neuron with a refractory period. Neurocomputing. 2004;58–60:53–58. [Google Scholar]
- Lazar AA. Population encoding with Hodgkin-Huxley neurons. IEEE Transactions on Information Theory, Special Issue on Molecular Biology and Neuroscience. 2010;56(2):821–837. doi: 10.1109/TIT.2009.2037040. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Lazar AA, Pnevmatikakis EA. Faithful representation of stimuli with a population of integrate-and-fire neurons. Neural Computation. 2008;20(11):2715–2744. doi: 10.1162/neco.2008.06-07-559. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Lazar AA, Pnevmatikakis EA, Zhou Y. Encoding natural scenes with neural circuits with random thresholds. Vision Research, Special Issue on Mathematical Models of Visual Coding. 2010;50(22):2200–2212. doi: 10.1016/j.visres.2010.03.015. [DOI] [PubMed] [Google Scholar]
- Lestienne R, Strehler BL. Time structure and stimulus dependence of precisely replicating patterns present in monkey cortical neuronal spike trains. Brain Research. 1987;437(2):214–238. doi: 10.1016/0006-8993(87)91638-6. [DOI] [PubMed] [Google Scholar]
- Lestienne R, Strehler B. Differences between monkey visual cortex cells in triplet and ghost doublet informational symbols relationships. Biological Cybernetics. 1988;59:337–352. doi: 10.1007/BF00332924. [DOI] [PubMed] [Google Scholar]
- Lewi J, Schneider D, Woolley S, Paninski L. Automating the design of informative sequences of sensory stimuli. Journal of Computational Neuroscience. 2011 doi: 10.1007/s10827-010-0248-1. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Lizier J, Heinzle J, Horstmann A, Haynes J-D, Prokopenko M. Multivariate information-theoretic measures reveal directed information structure and task relevant changes in fMRI connectivity. Journal of Computational Neuroscience. 2011 doi: 10.1007/s10827-010-0271-2. [DOI] [PubMed] [Google Scholar]
- Machens CK. Adaptive sampling by information maximization. Physical Review Letters. 2002;88(22):228104(4). doi: 10.1103/PhysRevLett.88.228104. [DOI] [PubMed] [Google Scholar]
- MacKay DM, McCulloch WS. The limiting information capacity of a neuronal link. Bulletin of Mathematical Biophysics. 1952;14:127–135. [Google Scholar]
- Massey J. Causality, feedback and directed information. Proc Intl Symp Information Theory Applications (ISITA-90) 1990:303–305. [Google Scholar]
- Massey J, Massey P. Conservation of mutual and directed information. Proc Intl Symp Information Theory (ISIT 2005) 2005:157–158. [Google Scholar]
- McCulloch WS. An upper bound on the informational capacity of a synapse. Proceedings of the 1952 ACM national meeting; Pittsburgh, Pennsylvania. 1952. [Google Scholar]
- Merolla PA, Boahen K. Dynamic computation in a recurrent network of heterogeneous silicon neurons. IEEE International Symposium on Circuits and Systems; IEEE Press; 2006. pp. 4539–4542. [Google Scholar]
- Milenkovic O, Alterovitz G, Battail G, Coleman TP, Hagenauer J, Meyn SP, et al. Introduction to the special issue on information theory in molecular biology and neuroscience. IEEE Transactions on Information Theory. 2010;56(2):649–652. [Google Scholar]
- Miller GA. Information Theory in Psychology: Problems and Methods. II-B. Free Press; 1955. Note on the bias of information estimates; pp. 95–100. [Google Scholar]
- Nemenman I, Bialek W, de Ruyter van Steveninck R. Entropy and information in neural spike trains: Progress on the sampling problem. Physical Review E. 2004;69:056111. doi: 10.1103/PhysRevE.69.056111. [DOI] [PubMed] [Google Scholar]
- Neymotin SA, Jacobs KM, Fenton AA, Lytton WW. Synaptic information transfer in computer models of neocortical columns. Journal of Computational Neuroscience. 2011 doi: 10.1007/s10827-010-0253-4. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Nirenberg S, Victor JD. Analyzing the activity of large populations of neurons—How tractable is the problem? Current Opinion in Neurobiology. 2007;17:397–400. doi: 10.1016/j.conb.2007.07.002. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Norwich K. On the information received by sensory receptors. Bulletin of Mathematical Biology. 1977;39:453–461. doi: 10.1007/BF02462923. [DOI] [PubMed] [Google Scholar]
- Ohiorhenuan IE, Mechler F, Purpura KP, Schmidt AM, Victor JD. Sparse coding and high-order correlations in fine-scale cortical networks. Nature. 2010;466:617–621. doi: 10.1038/nature09178. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Ohiorhenuan IE, Victor JD. Information geometric measure of 3-neuron firing patterns characterizes scale-dependence in cortical networks. Journal of Computational Neuroscience. 2011 doi: 10.1007/s10827-010-0257-0. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Optican LM, Richmond BJ. Temporal encoding of two-dimensional patterns by single units in primate inferior temporal cortex. III. Information theoretic analysis. Journal of Neurophysiology. 1987;57:162–178. doi: 10.1152/jn.1987.57.1.162. [DOI] [PubMed] [Google Scholar]
- Paninski L. Estimation of entropy and mutual information. Neural Computation. 2003;15:1191–1253. [Google Scholar]
- Paninski L. Asymptotic theory of information-theoretic experimental design. Neural Computation. 2005;17:1480–1507. doi: 10.1162/0899766053723032. [DOI] [PubMed] [Google Scholar]
- Poussart DJM. Membrane current noise in lobster axon under voltage clamp. Biophysical Journal. 1971;11(2):211–234. doi: 10.1016/S0006-3495(71)86209-4. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Quinn CJ, Coleman TP, Kiyavash N, Hatsopoulos NG. Estimating the directed information to infer causal relationships in ensemble neural spike train recordings. Journal of Computational Neuroscience. 2011 doi: 10.1007/s10827-010-0247-2. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Rapoport A, Horvath WJ. The theoretical channel capacity of a single neuron as determined by various coding systems. Information and Control. 1960;3(4):335–350. [Google Scholar]
- Samengo I. Information loss in an optimal maximum likelihood decoding. Neural Computation. 2002;14:771–779. doi: 10.1162/089976602317318947. [DOI] [PubMed] [Google Scholar]
- Schneidman E, Berry MJ, II, Segev R, Bialek W. Weak pairwise correlations imply strongly correlated network states in a neural population. Nature. 2006;440:1007–1012. doi: 10.1038/nature04701. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Schreiber T. Measuring information transfer. Physical Review Letters. 2000;85(2):461–464. doi: 10.1103/PhysRevLett.85.461. [DOI] [PubMed] [Google Scholar]
- Shannon CE. A mathematical theory of communication. Bell System Technical Journal. 1948;27:379–423, 623–656. [Google Scholar]
- Shannon CE. The bandwagon. IRE Transactions in Information Theory. 1956;2(1):3. [Google Scholar]
- Shannon CE, McCarthy J, editors. Annals of mathematics studies. Vol. 34. Princeton University Press; 1956. Automata studies. [Google Scholar]
- Shlens J, Field GD, Gauthier JL, Greschner M, Sher A, Litke AM, et al. The structure of large-scale synchronized firing in primate retina. Journal of Neuroscience. 2009;29:5022–5031. doi: 10.1523/JNEUROSCI.5187-08.2009. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Shlens J, Field GD, Gauthier JL, Grivich MI, Petrusca D, Sher A, et al. The structure of multi-neuron firing patterns in primate retina. Journal of Neuroscience. 2006;26:8254–8266. doi: 10.1523/JNEUROSCI.1282-06.2006. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Stark L, Negrete-Martinez J, Yankelevich G, Theodoridis G. Experiments on information coding in nerve impulse trains. Mathematical Biosciences. 1969;4(3–4):451–485. [Google Scholar]
- Stein RB. The information capacity of nerve cells using a frequency code. Biophysical Journal. 1967;7(6):797–826. doi: 10.1016/S0006-3495(67)86623-2. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Stein RB, French AS, Holden AV. The frequency response, coherence, and information capacity of two neuronal models. Biophysical Journal. 1972;12(3):295–322. doi: 10.1016/S0006-3495(72)86087-9. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Strong SP, Koberle R, de Ruyter van Steveninck RR, Bialek W. Entropy and information in neural spike trains. Physical Review Letters. 1998;80(1):197–200. [Google Scholar]
- Surmeier DJ, Weinberg RJ. The relationship between cross-correlation measures and underlying synaptic events. Brain Research. 1985;331(1):180–184. doi: 10.1016/0006-8993(85)90732-2. [DOI] [PubMed] [Google Scholar]
- Taylor RC. Integration in the crayfish antennal neuropile: Topographic representation and multiple-channel coding of mechanoreceptive submodalities. Developmental Neurobiology. 1975;6(5):475–499. doi: 10.1002/neu.480060505. [DOI] [PubMed] [Google Scholar]
- Tishby N, Pereira F, Bialek W. The information bottleneck method. Proceedings of the 37th annual Allerton conference on communication, control and computing; University of Illinois; 1999. [Google Scholar]
- Treves A, Panzeri S. The upward bias in measures of information derived from limited data samples. Neural Computation. 1995;7:399–440. [Google Scholar]
- Tsukada M, Aihara T, Hauske G. Redundancy reducing processes in single neurons. Biological Cybernetics. 1984;50:157–165. doi: 10.1007/BF00340023. [DOI] [PubMed] [Google Scholar]
- Tzanakou E, Michalak R, Harth E. The alopex process: Visual receptive fields by response feedback. Biological Cybernetics. 1979;35:161–174. doi: 10.1007/BF00337061. [DOI] [PubMed] [Google Scholar]
- Vanni S, Rosenström T. Local non-linear interactions in the visual cortex may reflect global decorrelation. Journal of Computational Neuroscience. 2011 doi: 10.1007/s10827-010-0239-2. [DOI] [PubMed] [Google Scholar]
- Vicente R, Wibral M, Lindner M, Pipa G. Transfer entropy—A model-free measure of effective connectivity for the neurosciences. Journal of Computational Neuroscience. 2011 doi: 10.1007/s10827-010-0262-3. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Victor JD. Binless strategies for estimation of information from neural data. Physical Review E. 2002;66:051903. doi: 10.1103/PhysRevE.66.051903. [DOI] [PubMed] [Google Scholar]
- Victor JD, Johannesma PM. Maximum-entropy approximations of stochastic nonlinear transductions: An extension of the Wiener theory. Biological Cybernetics. 1986;54:289–300. doi: 10.1007/BF00318425. [DOI] [PubMed] [Google Scholar]
- Walloe L. On the transmission of information through sensory neurons. Biophysical Journal. 1970;10(8):745–763. doi: 10.1016/S0006-3495(70)86333-0. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Werner G, Mountcastle VB. Neural activity in mechanoreceptive cutaneous afferents: Stimulus-response relations, Weber functions, and information transmission. Journal of Neurophysiology. 1965;28:359–397. doi: 10.1152/jn.1965.28.2.359. [DOI] [PubMed] [Google Scholar]
- Winograd S, Cowan JD. Reliable computation in the presence of noise. The MIT Press; 1963. [Google Scholar]