Summary of Information Theoretic Quantities

Living reference work entry in the Encyclopedia of Computational Neuroscience

Definition

Information theory is a practical and theoretical framework developed for the study of communication over noisy channels. Its probabilistic basis and capacity to relate statistical structure to function make it ideally suited for studying information flow in the nervous system. As a framework, it has a number of useful properties: it provides a general measure sensitive to any relationship, not only linear effects; its quantities have meaningful units which, in many cases, allow a direct comparison between different experiments; and it can be used to study how much information can be gained by observing neural responses in single experimental trials rather than in averages over multiple trials. A variety of information theoretic quantities are in common use in neuroscience, including the Shannon entropy, Kullback–Leibler divergence, and mutual information. In this entry, we introduce and define these quantities. Further details on how these quantities can be estimated in...
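For orientation, the standard textbook forms of the three quantities named above, for discrete random variables X and Y with probability distributions p and q, are summarized below; the symbols and conventions are introduced here for illustration and need not match the notation used in the full text of the entry.

H(X) = -\sum_{x} p(x) \log_2 p(x)

D_{\mathrm{KL}}(p \,\|\, q) = \sum_{x} p(x) \log_2 \frac{p(x)}{q(x)}

I(X;Y) = \sum_{x,y} p(x,y) \log_2 \frac{p(x,y)}{p(x)\,p(y)} = H(X) - H(X \mid Y)

With base-2 logarithms, all three quantities are measured in bits; using the natural logarithm instead gives values in nats.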



Acknowledgements

Research supported by the SI-CODE (FET-Open, FP7-284533) project and by the ABC and NETT (People Programme Marie Curie Actions PITN-GA-2011-290011 and PITN-GA-2011-289146) projects of the European Union’s Seventh Framework Programme FP7 2007–2013.

Author information

Correspondence to Robin A. A. Ince.


Copyright information

© 2014 Springer Science+Business Media New York

About this entry

Cite this entry

Ince, R.A.A., Panzeri, S., Schultz, S.R. (2014). Summary of Information Theoretic Quantities. In: Jaeger, D., Jung, R. (eds) Encyclopedia of Computational Neuroscience. Springer, New York, NY. https://doi.org/10.1007/978-1-4614-7320-6_306-1

  • DOI: https://doi.org/10.1007/978-1-4614-7320-6_306-1

  • Publisher Name: Springer, New York, NY

  • Online ISBN: 978-1-4614-7320-6
