Part of the book series: Understanding Complex Systems ((UCS))

Abstract

Any measure of interdependence can lose much of its appeal due to a poor choice of its numerical estimator. Information-theoretic functionals are particularly sensitive to this problem, especially when applied to noisy signals of only a few thousand data points or less. Unfortunately, this is a common scenario in applications to electrophysiology data sets. In this chapter, we will review the state-of-the-art estimators based on nearest-neighbor statistics for information transfer measures. Nearest-neighbor techniques are more data-efficient than naive partition or histogram estimators and rely on milder assumptions than parametric approaches. However, they also come with limitations and several parameter choices that influence the numerical estimation of information-theoretic functionals. We will describe step by step the efficient estimation of transfer entropy for a typical electrophysiology data set, and how the multi-trial structure of such data sets can be used to partially alleviate the problem of non-stationarity.
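To give a flavor of the nearest-neighbor statistics the chapter reviews, the sketch below implements a brute-force version of the Kraskov–Stögbauer–Grassberger (KSG) mutual information estimator, the building block commonly used for nearest-neighbor transfer entropy estimation. This is an illustrative sketch, not the authors' implementation: the function name, the digamma approximation, and the default neighbor count `k=4` are our own choices, and the O(N²) distance computation is only suitable for short signals.

```python
import numpy as np

def _digamma(x):
    # Digamma function via the recurrence psi(x) = psi(x+1) - 1/x,
    # then an asymptotic series once the argument is large enough.
    result = 0.0
    x = float(x)
    while x < 6.0:
        result -= 1.0 / x
        x += 1.0
    inv = 1.0 / x
    inv2 = inv * inv
    result += np.log(x) - 0.5 * inv - inv2 * (1/12 - inv2 * (1/120 - inv2 / 252))
    return result

def ksg_mutual_information(x, y, k=4):
    """KSG estimator (algorithm 1), brute-force O(N^2), max-norm distances.

    Returns an estimate of I(X;Y) in nats for two samples of equal length.
    """
    x = np.asarray(x, float).reshape(len(x), -1)
    y = np.asarray(y, float).reshape(len(y), -1)
    n = len(x)
    # Pairwise max-norm distances in each marginal and in the joint space.
    dx = np.max(np.abs(x[:, None, :] - x[None, :, :]), axis=-1)
    dy = np.max(np.abs(y[:, None, :] - y[None, :, :]), axis=-1)
    dz = np.maximum(dx, dy)
    np.fill_diagonal(dz, np.inf)          # exclude each point from its own neighbors
    eps = np.sort(dz, axis=1)[:, k - 1]   # distance to the k-th joint-space neighbor
    # Count marginal neighbors strictly inside eps_i (subtract 1 for the point itself).
    nx = np.sum(dx < eps[:, None], axis=1) - 1
    ny = np.sum(dy < eps[:, None], axis=1) - 1
    psi = np.vectorize(_digamma)
    return _digamma(k) + _digamma(n) - np.mean(psi(nx + 1) + psi(ny + 1))
```

On a few hundred samples of two correlated Gaussians this recovers a value close to the analytic mutual information, and it returns a value near zero for independent signals; practical toolboxes replace the brute-force distance matrix with tree-based or box-assisted neighbor searches, as the chapter discusses.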



Corresponding author

Correspondence to Raul Vicente.

Copyright information

© 2014 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Vicente, R., Wibral, M. (2014). Efficient Estimation of Information Transfer. In: Wibral, M., Vicente, R., Lizier, J. (eds) Directed Information Measures in Neuroscience. Understanding Complex Systems. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-54474-3_2

Download citation

  • DOI: https://doi.org/10.1007/978-3-642-54474-3_2

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-54473-6

  • Online ISBN: 978-3-642-54474-3

  • eBook Packages: Engineering
