
Correntropy for Random Variables: Properties and Applications in Statistical Inference

  • Chapter

Part of the book series: Information Science and Statistics ((ISS))

Abstract

Similarity is a key concept for quantifying temporal signals or static measurements. Although similarity is difficult to define mathematically, one rarely dwells on this difficulty and naturally translates similarity into correlation. This is one more example of how engrained second-order moment descriptors of the probability density function are in scientific thinking. Successful engineering or pattern recognition solutions built on these methodologies rely heavily on the Gaussianity and linearity assumptions, for exactly the same reasons discussed in Chapter 3.
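The abstract's point — that correlation as a similarity measure degrades when the Gaussianity assumption fails, whereas correntropy (the chapter's subject) does not — can be illustrated with a short sketch. This is not code from the chapter; it uses the standard sample correntropy estimator with a Gaussian kernel, V(X, Y) ≈ (1/N) Σᵢ exp(−(xᵢ − yᵢ)² / 2σ²), and the kernel width σ = 1 and outlier magnitudes are illustrative choices.

```python
import numpy as np

def correlation(x, y):
    """Classical second-order similarity: Pearson correlation coefficient."""
    return np.corrcoef(x, y)[0, 1]

def correntropy(x, y, sigma=1.0):
    """Sample correntropy with a Gaussian kernel of width sigma.

    Each sample contributes exp(-(x_i - y_i)^2 / (2 sigma^2)), which is
    bounded by 1, so a few impulsive outliers cannot dominate the average.
    """
    d = np.asarray(x) - np.asarray(y)
    return float(np.mean(np.exp(-d**2 / (2.0 * sigma**2))))

rng = np.random.default_rng(0)
x = rng.normal(size=1000)
y = x + 0.1 * rng.normal(size=1000)   # y is a slightly noisy copy of x
y_imp = y.copy()
y_imp[:10] += 50.0                    # inject a handful of impulsive outliers

# Correlation collapses under the outliers; correntropy barely moves,
# because each outlier's kernel contribution is essentially zero.
print("correlation:", correlation(x, y), "->", correlation(x, y_imp))
print("correntropy:", correntropy(x, y), "->", correntropy(x, y_imp))
```

Under impulsive contamination the correlation coefficient drops sharply (the outliers inflate the variance of y), while the correntropy estimate changes by roughly the fraction of contaminated samples — the robustness property the chapter develops formally.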



Copyright information

© 2010 Springer Science+Business Media, LLC

About this chapter

Cite this chapter

Liu, W., Pokharel, P., Xu, J., Seth, S. (2010). Correntropy for Random Variables: Properties and Applications in Statistical Inference. In: Information Theoretic Learning. Information Science and Statistics. Springer, New York, NY. https://doi.org/10.1007/978-1-4419-1570-2_10


  • DOI: https://doi.org/10.1007/978-1-4419-1570-2_10


  • Publisher Name: Springer, New York, NY

  • Print ISBN: 978-1-4419-1569-6

  • Online ISBN: 978-1-4419-1570-2
