
Hermite Polynomials and Measures of Non-gaussianity

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 6792)

Abstract

We first review some rigorous properties of the Hermite polynomials and demonstrate their usefulness in estimating probability distributions as series from data samples. We then explain how these series yield precise and robust measures of non-Gaussianity. Our measures detect all kinds of deviations from Gaussianity, and thus provide reliable objective functions for ICA. With computational complexity linear in the sample size, the method is also suitable for large data sets.
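The general idea behind such measures can be sketched as follows. This is a minimal illustration, not the authors' exact estimator: it estimates the coefficients of a Gram–Charlier-type Hermite expansion, E[He_n(x)], from standardised samples via the three-term recurrence, and sums the squared normalised coefficients of order three and above, all of which vanish in expectation under a Gaussian. The function names `hermite_coeffs` and `nongaussianity` are hypothetical. The cost is a single pass over the data, i.e. linear in the sample size.

```python
import numpy as np
from math import factorial

def hermite_coeffs(x, nmax):
    """Estimate E[He_n(x)] for n = 0..nmax from samples x.

    Uses the probabilists' Hermite recurrence
    He_{n+1}(x) = x * He_n(x) - n * He_{n-1}(x),
    so the cost is linear in the number of samples.
    """
    x = np.asarray(x, dtype=float)
    # Standardise: the measure should capture shape, not mean or variance.
    x = (x - x.mean()) / x.std()
    coeffs = np.empty(nmax + 1)
    h_prev, h_cur = np.ones_like(x), x
    coeffs[0] = 1.0
    if nmax >= 1:
        coeffs[1] = h_cur.mean()
    for n in range(1, nmax):
        h_prev, h_cur = h_cur, x * h_cur - n * h_prev
        coeffs[n + 1] = h_cur.mean()
    return coeffs

def nongaussianity(x, nmax=6):
    """Sum of squared normalised Hermite coefficients for n >= 3.

    Under N(0, 1), E[He_n] = 0 for all n >= 1, so this is zero in
    expectation exactly when the standardised data are Gaussian.
    The n! factor is the squared norm of He_n under the Gaussian weight.
    """
    c = hermite_coeffs(x, nmax)
    norms = np.array([factorial(n) for n in range(3, nmax + 1)], dtype=float)
    return float(np.sum(c[3:] ** 2 / norms))

rng = np.random.default_rng(0)
gauss = rng.normal(size=100_000)
laplace = rng.laplace(size=100_000)
print(nongaussianity(gauss))    # close to zero
print(nongaussianity(laplace))  # clearly positive (heavy tails)
```

For standardised data the first- and second-order coefficients vanish by construction, which is why the measure starts at n = 3; truncating at a moderate `nmax` trades sensitivity to fine tail structure against estimation variance.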




Copyright information

© 2011 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Puuronen, J., Hyvärinen, A. (2011). Hermite Polynomials and Measures of Non-gaussianity. In: Honkela, T., Duch, W., Girolami, M., Kaski, S. (eds) Artificial Neural Networks and Machine Learning – ICANN 2011. ICANN 2011. Lecture Notes in Computer Science, vol 6792. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-21738-8_27


  • Print ISBN: 978-3-642-21737-1

  • Online ISBN: 978-3-642-21738-8

  • eBook Packages: Computer Science (R0)