Estimation of Differential Entropy for Positive Random Variables and Its Application in Computational Neuroscience


Summary

In this chapter we use the concept of differential entropy and methods for its estimation. We begin by defining the basic terms: entropy, differential entropy, the Kullback–Leibler distance, and the refractory period. We also show relations between differential entropy and the Kullback–Leibler distance.
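For orientation, the two central quantities can be written out explicitly (a minimal sketch in standard notation, not taken from the chapter; f denotes the probability density of a positive random variable X and g a reference density):

$$
h(X) = -\int_0^{\infty} f(x)\,\log f(x)\,\mathrm{d}x,
\qquad
D(f\,\|\,g) = \int_0^{\infty} f(x)\,\log\frac{f(x)}{g(x)}\,\mathrm{d}x .
$$

One relation of the kind referred to above: if g is taken to be the exponential density with the same mean as X (the maximum-entropy density on the positive half-line for a fixed mean), then $D(f\,\|\,g) = 1 + \log \mathrm{E}[X] - h(X)$, so the Kullback–Leibler distance measures how far h(X) falls short of the largest entropy attainable with that mean.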

A detailed description of the methods used then follows. These methods fall into three groups: parametric entropy estimators, “plug-in” entropy estimators based on nonparametric density estimation, and direct entropy estimators. Formulas for direct entropy estimation based on the first four sample moments are introduced.
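As a concrete illustration of the “plug-in” idea (and not of the chapter's moment-based formulas, which are not reproduced here), the following Python sketch substitutes a histogram density estimate, and alternatively a Gaussian kernel estimate, into the defining integral of differential entropy; the function names and binning rule are illustrative assumptions.

import numpy as np
from scipy.stats import gaussian_kde

def histogram_entropy(x, bins="fd"):
    # Plug-in estimate: replace the unknown density f by a histogram and
    # evaluate h_hat = -sum_i p_i * log(p_i / width_i), in nats.
    counts, edges = np.histogram(np.asarray(x, dtype=float), bins=bins)
    widths = np.diff(edges)
    p = counts / counts.sum()
    mask = p > 0                      # empty bins contribute nothing
    return -np.sum(p[mask] * np.log(p[mask] / widths[mask]))

def kde_entropy(x):
    # Resubstitution plug-in estimate: h_hat = -(1/n) * sum_i log f_hat(x_i),
    # with f_hat a Gaussian kernel density estimate of the sample.
    x = np.asarray(x, dtype=float)
    f_hat = gaussian_kde(x)
    return -np.mean(np.log(f_hat(x)))

For positive data such as interspike intervals, the Gaussian kernel places some probability mass below zero near the origin, a well-known source of bias for plug-in estimates on the positive half-line.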

The results are illustrated by comparing the entropy estimation methods, each combined with two refractory period estimates. We compare estimates based on the histogram, the kernel density estimator, the sample spacing method, Vasicek's method, the nearest-neighbor distance method, and the methods based on sample moments.
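To illustrate two of the direct estimators named above, here is a minimal Python sketch of Vasicek's sample-spacing estimator and a one-dimensional nearest-neighbor (Kozachenko–Leonenko) estimator, evaluated on gamma-distributed samples as a stand-in for positive interspike intervals; the window choice m ≈ sqrt(n), the gamma parameters, and the function names are illustrative assumptions, not the chapter's settings.

import numpy as np
from scipy.special import gammaln, digamma

EULER_GAMMA = 0.5772156649015329

def vasicek_entropy(x, m=None):
    # Vasicek's estimator (in nats): mean of log( n/(2m) * (x_(i+m) - x_(i-m)) )
    # over i = 1..n, with order statistics clamped at the sample extremes.
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    if m is None:
        m = max(1, int(round(np.sqrt(n))))       # common heuristic window
    padded = np.concatenate([np.full(m, x[0]), x, np.full(m, x[-1])])
    spacings = padded[2 * m:] - padded[:-2 * m]  # x_(i+m) - x_(i-m)
    return np.mean(np.log(n / (2.0 * m) * spacings))

def nn_entropy(x):
    # Kozachenko-Leonenko estimator in one dimension (in nats):
    # mean(log(2 * rho_i)) + log(n - 1) + Euler's constant,
    # where rho_i is the distance from x_i to its nearest neighbour.
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    gaps = np.diff(x)
    rho = np.minimum(np.concatenate([[np.inf], gaps]),
                     np.concatenate([gaps, [np.inf]]))
    return np.mean(np.log(2.0 * rho)) + np.log(n - 1) + EULER_GAMMA

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    k, theta = 2.0, 0.5                          # illustrative gamma parameters
    isi = rng.gamma(shape=k, scale=theta, size=2000)
    # exact differential entropy of Gamma(k, theta) for comparison
    h_true = k + np.log(theta) + gammaln(k) + (1.0 - k) * digamma(k)
    print(f"exact   : {h_true:.3f} nats")
    print(f"Vasicek : {vasicek_entropy(isi):.3f} nats")
    print(f"1-NN    : {nn_entropy(isi):.3f} nats")

Both estimators return values in nats; the Vasicek estimator is known to be biased for small samples, which is one reason several methods are usually compared side by side.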


References

  1. Beirlant, J., Dudewicz, E. J., Györfi, L., van der Meulen, E. C.: Nonparametric entropy estimation: an overview. Int. J. Math. Stat. Sci., 6, 17–39 (1997).

  2. Cover, T. M., Thomas, J. A.: Elements of Information Theory. John Wiley & Sons, New York (1991).

  3. Johnson, N. L., Kotz, S.: Distributions in Statistics: Continuous Univariate Distributions, Vol. 1. John Wiley & Sons, New York (1970).

  4. Kostal, L., Lansky, P.: Similarity of interspike interval distributions and information gain in stationary neuronal firing. Biol. Cybernet., 94(2), 157–167 (2006).

  5. Reeke, G. N., Coop, A. D.: Estimating the temporal interval entropy of neuronal discharge. Neural Comput., 16, 941–970 (2004).

  6. Vasicek, O.: A test for normality based on sample entropy. J. Roy. Statist. Soc. B, 38, 54–59 (1976).



Copyright information

© 2008 Birkhäuser Boston

About this chapter

Cite this chapter

Hampel, D. (2008). Estimation of Differential Entropy for Positive Random Variables and Its Application in Computational Neuroscience. In: Deutsch, A., et al. (eds.) Mathematical Modeling of Biological Systems, Volume II. Modeling and Simulation in Science, Engineering and Technology. Birkhäuser Boston. https://doi.org/10.1007/978-0-8176-4556-4_19
