Editorial

Entropy. 2019 Jan 14;21(1):62. doi: 10.3390/e21010062

Information Theory in Neuroscience

Eugenio Piasini, Stefano Panzeri
PMCID: PMC7514168  PMID: 33266778

Abstract

This Editorial summarizes the scope and contents of the Special Issue "Information Theory in Neuroscience".

Keywords: information theory, neuroscience


As the ultimate information processing device, the brain naturally lends itself to study with information theory. Accordingly, information theory [1] has been applied systematically to the study of the brain for many decades and has been instrumental in many advances. It has spurred the development of principled theories of brain function [2,3,4,5,6,7,8]. It has led to advances in the study of consciousness [9]. It has also led to many influential techniques for analyzing neural recordings in order to crack the neural code, that is, to unveil the language used by neurons to encode and process information [10,11,12,13,14,15].
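To make concrete the kind of quantity at the heart of these analysis techniques, the following minimal Python sketch computes the Shannon mutual information between a discrete stimulus and a binned neural response with the standard plug-in estimator. It is a generic illustration, not taken from any of the cited works; the toy data and variable names are hypothetical.

```python
import numpy as np

def mutual_information(stim, resp):
    """Plug-in estimate of I(S;R) in bits from paired discrete samples."""
    stim = np.asarray(stim)
    resp = np.asarray(resp)
    s_vals, s_idx = np.unique(stim, return_inverse=True)
    r_vals, r_idx = np.unique(resp, return_inverse=True)
    # Joint probability table p(s, r) estimated from counts.
    joint = np.zeros((len(s_vals), len(r_vals)))
    np.add.at(joint, (s_idx, r_idx), 1)
    joint /= joint.sum()
    p_s = joint.sum(axis=1, keepdims=True)  # marginal p(s)
    p_r = joint.sum(axis=0, keepdims=True)  # marginal p(r)
    nz = joint > 0                          # skip zero cells to avoid log(0)
    return np.sum(joint[nz] * np.log2(joint[nz] / (p_s @ p_r)[nz]))

# Hypothetical toy example: two stimuli, spike counts weakly stimulus-dependent.
rng = np.random.default_rng(0)
stim = rng.integers(0, 2, size=5000)
resp = rng.poisson(lam=2 + 3 * stim)        # higher firing rate for stimulus 1
print(f"I(S;R) ~ {mutual_information(stim, resp):.3f} bits")
```

In practice, plug-in estimates of this kind are biased upward when data are limited, which is one reason the literature above devotes considerable attention to bias correction [10,13,14].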

The influence of information theory on the study of neural information processing continues today in many ways. In particular, concepts from information theory are beginning to be applied to the large-scale recordings of neural activity obtainable with techniques such as two-photon calcium imaging, in order to understand the nature of the neural population code [16]. Experimental advances that allow precise recording and manipulation of neural activity on a large scale now make it possible, for the first time, to formulate precisely and test quantitatively hypotheses about how the brain encodes the information used for specific functions and transmits it across areas; information theory is a formalism that plays a useful role in both the analysis and the design of such experiments [17].

This Special Issue presents twelve original contributions spanning a wide range of topics: novel applications of information theory to neuroscience, and new information-theoretic results inspired by problems in neuroscience.

Two papers use the principle of maximum entropy [18] to build models that detect functional interactions between neurons and probe their potential role in neural information processing [19,20]. Kitazono et al. [21] and Bonmati et al. [22] develop concepts relating information theory to measures of complexity and integrated information; these techniques have a wide range of potential applications, not least the study of how consciousness emerges from the dynamics of the brain. Other work uses information theory as a tool to investigate different aspects of brain dynamics, from latching in neural networks [23], to the lifespan development of the human brain studied with functional imaging data [24], to the rapid information processing possibly mediated by synfire chains [26], the spatiotemporal firing patterns that have been reported in studies of simultaneously recorded spike trains [25]. Other studies build bridges between information theory and the theory of inference [27], and between information theory and categorical perception mediated by representation similarity in neural activity [28]. One paper [29] uses the recently developed framework of partial information decomposition [30] to investigate the origins of synergy and redundancy in information representations, a topic of strong interest for understanding how neurons in the brain work together to represent information [31]. Finally, two contributions by Samengo and colleagues examine applications of information theory to two problems of empirical importance in neuroscience: how to assess the relevance of specific response features in a neural code [32], and what code neurons in the rodent temporal lobe use to encode information [33].
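To make the maximum entropy idea concrete (a generic illustration, not the specific models of [19,20]): under constraints on single-neuron firing rates alone, the maximum entropy distribution over binary spike patterns is the independent model, so the gap between its entropy and that of the observed patterns, the multi-information, quantifies how much structure functional interactions add. The sketch below computes this gap for three hypothetical binary neurons; the toy data are invented for illustration.

```python
import numpy as np

def pattern_entropy(patterns):
    """Entropy in bits of the empirical distribution over binary patterns."""
    _, counts = np.unique(patterns, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def independent_entropy(patterns):
    """Entropy of the maximum entropy model constrained only by firing rates:
    the independent model, whose entropy is the sum of single-cell entropies."""
    h = 0.0
    for rate in patterns.mean(axis=0):      # p(x_i = 1) for each neuron
        if 0 < rate < 1:
            h -= rate * np.log2(rate) + (1 - rate) * np.log2(1 - rate)
    return h

# Toy data: three neurons; neuron 2 tends to copy neuron 0 (an interaction).
rng = np.random.default_rng(1)
n0 = rng.random(10000) < 0.3
n1 = rng.random(10000) < 0.2
n2 = np.where(rng.random(10000) < 0.8, n0, rng.random(10000) < 0.3)
patterns = np.column_stack([n0, n1, n2]).astype(int)

h_ind = independent_entropy(patterns)
h_obs = pattern_entropy(patterns)
print(f"Multi-information = {h_ind - h_obs:.3f} bits")  # > 0 reveals interactions
```

The models in [19,20] go further, constraining pairwise or higher-order moments rather than firing rates alone, but the logic is the same: any entropy unaccounted for by the constrained model signals structure beyond those constraints.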

Acknowledgments

We are grateful to the contributing authors, to the anonymous referees, and to the Editorial Staff of Entropy for their excellent and tireless work, which made this Special Issue possible.

Author Contributions

E.P. and S.P. wrote the paper.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Shannon C.E. A Mathematical Theory of Communication. Bell Syst. Tech. J. 1948;27:379–423. doi: 10.1002/j.1538-7305.1948.tb01338.x.
2. Srinivasan M.V., Laughlin S.B., Dubs A., Horridge G.A. Predictive coding: A fresh view of inhibition in the retina. Proc. R. Soc. Lond. Ser. B Biol. Sci. 1982;216:427–459. doi: 10.1098/rspb.1982.0085.
3. Atick J.J., Redlich A.N. Towards a Theory of Early Visual Processing. Neural Comput. 1990;2:308–320. doi: 10.1162/neco.1990.2.3.308.
4. Dong D.W., Atick J.J. Temporal decorrelation: A theory of lagged and nonlagged responses in the lateral geniculate nucleus. Netw. Comput. Neural Syst. 1995;6:159–178. doi: 10.1088/0954-898X_6_2_003.
5. Laughlin S.B., de Ruyter van Steveninck R.R., Anderson J.C. The metabolic cost of neural information. Nat. Neurosci. 1998;1:36–41. doi: 10.1038/236.
6. Hermundstad A.M., Briguglio J.J., Conte M.M., Victor J.D., Balasubramanian V., Tkačik G. Variance predicts salience in central sensory processing. eLife. 2014;3:e03722. doi: 10.7554/eLife.03722.
7. Billings G., Piasini E., Lőrincz A., Nusser Z., Silver R.A. Network Structure within the Cerebellar Input Layer Enables Lossless Sparse Encoding. Neuron. 2014;83:960–974. doi: 10.1016/j.neuron.2014.07.020.
8. Chalk M., Marre O., Tkačik G. Toward a unified theory of efficient, predictive, and sparse coding. Proc. Natl. Acad. Sci. USA. 2018;115:186–191. doi: 10.1073/pnas.1711114115.
9. Tononi G., Sporns O., Edelman G.M. A measure for brain complexity: Relating functional segregation and integration in the nervous system. Proc. Natl. Acad. Sci. USA. 1994;91:5033–5037. doi: 10.1073/pnas.91.11.5033.
10. Strong S.P., Koberle R., de Ruyter van Steveninck R.R., Bialek W. Entropy and information in neural spike trains. Phys. Rev. Lett. 1998;80:197–200. doi: 10.1103/PhysRevLett.80.197.
11. Borst A., Theunissen F.E. Information theory and neural coding. Nat. Neurosci. 1999;2:947–957. doi: 10.1038/14731.
12. Schneidman E., Berry M.J., Segev R., Bialek W. Weak pairwise correlations imply strongly correlated network states in a neural population. Nature. 2006;440:1007–1012. doi: 10.1038/nature04701.
13. Quian Quiroga R., Panzeri S. Extracting information from neural populations: Information theory and decoding approaches. Nat. Rev. Neurosci. 2009;10:173–185. doi: 10.1038/nrn2578.
14. Victor J.D. Approaches to information-theoretic analysis of neural activity. Biol. Theory. 2006;1:302–316. doi: 10.1162/biot.2006.1.3.302.
15. Tkačik G., Marre O., Amodei D., Bialek W., Berry M.J. Searching for Collective Behavior in a Large Network of Sensory Neurons. PLoS Comput. Biol. 2014;10:e1003408. doi: 10.1371/journal.pcbi.1003408.
16. Runyan C.A., Piasini E., Panzeri S., Harvey C.D. Distinct timescales of population coding across cortex. Nature. 2017;548:92–96. doi: 10.1038/nature23020.
17. Panzeri S., Harvey C.D., Piasini E., Latham P.E., Fellin T. Cracking the Neural Code for Sensory Perception by Combining Statistics, Intervention, and Behavior. Neuron. 2017;93:491–507. doi: 10.1016/j.neuron.2016.12.036.
18. Jaynes E.T. Information theory and statistical mechanics. Phys. Rev. 1957;106:620–630. doi: 10.1103/PhysRev.106.620.
19. Cofré R., Maldonado C. Information Entropy Production of Maximum Entropy Markov Chains from Spike Trains. Entropy. 2018;20:34. doi: 10.3390/e20010034.
20. Cayco-Gajic N.A., Zylberberg J., Shea-Brown E. A Moment-Based Maximum Entropy Model for Fitting Higher-Order Interactions in Neural Data. Entropy. 2018;20:489. doi: 10.3390/e20070489.
21. Kitazono J., Kanai R., Oizumi M. Efficient Algorithms for Searching the Minimum Information Partition in Integrated Information Theory. Entropy. 2018;20:173. doi: 10.3390/e20030173.
22. Bonmati E., Bardera A., Feixas M., Boada I. Novel Brain Complexity Measures Based on Information Theory. Entropy. 2018;20:491. doi: 10.3390/e20070491.
23. Kang C.J., Naim M., Boboeva V., Treves A. Life on the Edge: Latching Dynamics in a Potts Neural Network. Entropy. 2017;19:468. doi: 10.3390/e19090468.
24. Fan Y., Zeng L.L., Shen H., Qin J., Li F., Hu D. Lifespan Development of the Human Brain Revealed by Large-Scale Network Eigen-Entropy. Entropy. 2017;19:471. doi: 10.3390/e19090471.
25. Abeles M., Bergman H., Margalit E., Vaadia E. Spatiotemporal firing patterns in the frontal cortex of behaving monkeys. J. Neurophysiol. 1993;70:1629–1638. doi: 10.1152/jn.1993.70.4.1629.
26. Xiao Z., Wang B., Sornborger A.T., Tao L. Mutual Information and Information Gating in Synfire Chains. Entropy. 2018;20:102. doi: 10.3390/e20020102.
27. Isomura T. A Measure of Information Available for Inference. Entropy. 2018;20:512. doi: 10.3390/e20070512.
28. Brasselet R., Arleo A. Category Structure and Categorical Perception Jointly Explained by Similarity-Based Information Theory. Entropy. 2018;20:527. doi: 10.3390/e20070527.
29. Chicharro D., Pica G., Panzeri S. The Identity of Information: How Deterministic Dependencies Constrain Information Synergy and Redundancy. Entropy. 2018;20:169. doi: 10.3390/e20030169.
30. Williams P.L., Beer R.D. Nonnegative Decomposition of Multivariate Information. arXiv. 2010. arXiv:1004.2515.
31. Griffith V., Koch C. Quantifying Synergistic Mutual Information. In: Prokopenko M., editor. Guided Self-Organization: Inception. Springer; Berlin, Germany: 2014. pp. 159–190.
32. Eyherabide H.G., Samengo I. Assessing the Relevance of Specific Response Features in the Neural Code. Entropy. 2018;20:879. doi: 10.3390/e20110879.
33. Maidana Capitán M.B., Kropff E., Samengo I. Information-Theoretical Analysis of the Neural Code in the Rodent Temporal Lobe. Entropy. 2018;20:571. doi: 10.3390/e20080571.
