Adaptive Information Filtering with Error Entropy and Error Correntropy Criteria

Part of the book series: Information Science and Statistics (ISS)

Abstract

This chapter formulates a new cost function for adaptive filtering based on Rényi's quadratic error entropy. The problem of estimating the linear system parameters \(\mathbf{w} = [w_0, \ldots, w_{M-1}]^{\mathrm{T}}\) in the setting of Figure 3.1, where x(n) and z(n) are random variables, can be framed as model-based inference, because it relates measured data, uncertainty, and the functional description of the system and its parameters. The desired response z(n) can be thought of as being created by an unknown transformation of the input vector \(\mathbf{x} = [x(n), \ldots, x(n - M + 1)]^{\mathrm{T}}\). Adaptive filtering theory [143, 284] addresses this problem using the MSE criterion applied to the error signal \(e(n) = z(n) - f(\mathbf{w}, x(n))\):

$J_w(e(n)) = E\left[(z(n) - f(\mathbf{w}, x(n)))^2\right]$
(3.1)

when the linear filter is a finite impulse response (FIR) filter:

$y(n) = \sum\limits_{k=0}^{M-1} w_k\, x(n - k).$
(3.2)
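
To make the contrast between the criteria concrete, here is a minimal sketch, not taken from the chapter, that computes the FIR output (3.2), the sample MSE (3.1), and a Parzen-window (Gaussian-kernel) estimate of Rényi's quadratic entropy of the error, the quantity that the error-entropy criterion developed in this chapter minimizes. The kernel size `sigma`, the synthetic plant, and all names are illustrative assumptions.

```python
import numpy as np

def fir_output(w, x):
    """Eq. (3.2): y(n) = sum_{k=0}^{M-1} w_k x(n-k).
    Returns outputs for n = M-1, ..., N-1 (fully populated regressors only)."""
    M, N = len(w), len(x)
    # Tap-delay-line matrix: column k holds x(n - k).
    X = np.column_stack([x[M - 1 - k : N - k] for k in range(M)])
    return X @ w

def mse_cost(e):
    """Eq. (3.1): J_w = E[e^2], estimated by the sample mean."""
    return np.mean(e ** 2)

def quadratic_error_entropy(e, sigma=1.0):
    """Parzen estimate of Renyi's quadratic entropy H2(e) = -log V(e), where
    the information potential V(e) is the mean of Gaussian kernels of width
    sigma*sqrt(2) over all pairwise error differences. Minimizing H2 is
    equivalent to maximizing V."""
    d = e[:, None] - e[None, :]        # N x N matrix of differences e_i - e_j
    s2 = 2.0 * sigma ** 2              # variance of the sqrt(2)-scaled kernel
    V = np.mean(np.exp(-(d ** 2) / (2.0 * s2)) / np.sqrt(2.0 * np.pi * s2))
    return -np.log(V)

# Illustrative use: a hypothetical 3-tap plant and an untrained (zero) filter.
rng = np.random.default_rng(0)
w_true = np.array([0.7, -0.3, 0.2])
x = rng.standard_normal(500)
z = fir_output(w_true, x) + 0.05 * rng.standard_normal(500 - len(w_true) + 1)
e = z - fir_output(np.zeros(3), x)
print(f"MSE = {mse_cost(e):.4f}, H2(e) = {quadratic_error_entropy(e, 0.5):.4f}")
```

Both costs are functions of the same error sequence, but the MSE penalizes only its second moment, whereas the entropy estimator depends on the full shape of the error distribution; this sensitivity to higher-order statistics is what motivates the criteria developed in this chapter.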


References

  1. Aczél J., Daróczy Z., On Measures of Information and Their Characterizations, Mathematics in Science and Engineering, vol. 115, Academic Press, New York, 1975.

  2. Ahmad I., Lin P., A nonparametric estimation of the entropy for absolutely continuous distributions, IEEE Trans. Inf. Theor., 22:372–375, 1976.

  3. Al-Naffouri T., Zerguine A., Bettayeb M., A unifying view of error nonlinearities in LMS adaptation, in Proc. ICASSP, vol. III, Seattle, pp. 1697–1700, May 1998.

  4. Amari S., Nagaoka H., Methods of Information Geometry, Translations of Mathematical Monographs, vol. 191, American Mathematical Society, Providence, RI, 2000.

  5. Chen B., Hu J., Pu L., Sun Z., Stochastic gradient algorithm under (h, ϕ)-entropy criterion, Circuits Syst. Signal Process., 26:941–960, 2007.

  6. Douglas S., Meng H., Stochastic gradient adaptation under general error criteria, IEEE Trans. Signal Process., 42:1335–1351, 1994.

  7. Edmonson W., Srinivasan K., Wang C., Principe J., A global least square algorithm for adaptive IIR filtering, IEEE Trans. Circuits Syst., 45(3):379–384, 1996.

  8. Erdogmus D., Principe J., An error-entropy minimization algorithm for supervised training of nonlinear adaptive systems, IEEE Trans. Signal Process., 50(7):1780–1786, 2002.

  9. Erdogmus D., Principe J., Generalized information potential criterion for adaptive system training, IEEE Trans. Neural Netw., 13(5):1035–1044, 2002.

  10. Fox J., An R and S-PLUS Companion to Applied Regression, Sage, London, 2002.

  11. Hampel F.R., Ronchetti E.M., Rousseeuw P.J., Stahel W.A., Robust Statistics: The Approach Based on Influence Functions, Wiley, New York, 1985.

  12. Härdle W., Applied Nonparametric Regression, Econometric Society Monographs, vol. 19, Cambridge University Press, New York, 1990.

  13. Haykin S., Adaptive Filter Theory, 4th ed., Prentice Hall, Englewood Cliffs, NJ, 2002.

  14. Huber P.J., Robust estimation of a location parameter, Ann. Math. Statist., 35:73–101, 1964.

  15. Jenssen R., Erdogmus D., Hild II K., Principe J., Eltoft T., Information cut for clustering using a gradient descent approach, Pattern Recogn., 40:796–806, 2006.

  16. Liu W., Pokharel P., Principe J., Error entropy, correntropy and M-estimation, in Proc. IEEE Int. Workshop on Machine Learning for Signal Processing, 2006.

  17. Liu W., Pokharel P., Principe J., Correntropy: properties and applications in non-Gaussian signal processing, IEEE Trans. Signal Process., 55(11):5286–5298, 2007.

  18. Middleton D., Statistical-physical models of electromagnetic interference, IEEE Trans. Electromagn. Compat., EMC-19(3):106–126, Aug. 1977.

  19. Morejon R., An information theoretic approach to sonar automatic target recognition, Ph.D. dissertation, University of Florida, Spring 2003.

  20. Pei S., Tseng C., Least mean p-power error criterion for adaptive FIR filter, IEEE J. Sel. Areas Commun., 12(9):1540–1547, 1994.

  21. Rubinstein R., Simulation and the Monte Carlo Method, John Wiley & Sons, New York, 1981.

  22. Sayed A., Fundamentals of Adaptive Filtering, John Wiley & Sons, New York, 2003.

  23. Sidak Z., Sen P., Hajek J., Theory of Rank Tests, Academic Press, London, 1999.

  24. Singh A., Principe J., Using correntropy as a cost function in linear adaptive filters, in Proc. IEEE IJCNN, Atlanta, 2009.

  25. Styblinski M., Tang T., Experiments in nonconvex optimization: stochastic approximation with function smoothing and simulated annealing, Neural Netw., 3:467–483, 1990.

  26. Tanrikulu O., Chambers J., Convergence and steady-state properties of the least-mean mixed-norm (LMMN) adaptive algorithm, IEE Proc. Vision, Image Signal Process., 143:137–142, June 1996.

  27. Walach E., Widrow B., The least mean fourth (LMF) adaptive algorithm and its family, IEEE Trans. Inf. Theor., IT-30(2):275–283, 1984.

  28. Widrow B., Stearns S., Adaptive Signal Processing, Prentice Hall, Englewood Cliffs, NJ, 1985.

Copyright information

© 2010 Springer Science+Business Media, LLC

About this chapter

Cite this chapter

Erdogmus, D., Liu, W. (2010). Adaptive Information Filtering with Error Entropy and Error Correntropy Criteria. In: Information Theoretic Learning. Information Science and Statistics. Springer, New York, NY. https://doi.org/10.1007/978-1-4419-1570-2_3

  • DOI: https://doi.org/10.1007/978-1-4419-1570-2_3

  • Publisher Name: Springer, New York, NY

  • Print ISBN: 978-1-4419-1569-6

  • Online ISBN: 978-1-4419-1570-2
