M-Estimators and Half-Quadratic Minimization

Chapter in Robust Recognition via Information Theoretic Learning

Part of the book series: SpringerBriefs in Computer Science

Abstract

In robust statistics, there are several types of robust estimators, including the M-estimator (maximum likelihood type estimator), the L-estimator (a linear combination of order statistics), the R-estimator (an estimator based on rank transformation) [77], the RM estimator (repeated median) [141], and the LMS estimator (least median of squares) [133]. When information theoretic learning is applied to robust statistics, the Gaussian kernel in entropy plays the role of the Welsch M-estimator and can be efficiently optimized by half-quadratic minimization. Hence, in this chapter we introduce some basic concepts of M-estimation and half-quadratic minimization.
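
To make the connection concrete, below is a minimal sketch (not code from the book) of robust linear regression with the Welsch loss \(\rho(e) = \tfrac{\sigma^{2}}{2}\bigl(1 - e^{-e^{2}/\sigma^{2}}\bigr)\), solved by the multiplicative half-quadratic scheme: each iteration computes the closed-form Gaussian weights and then solves a weighted least-squares problem. The function names and the kernel width sigma are illustrative choices, not notation from the chapter.

    import numpy as np

    def welsch_weight(residual, sigma):
        # Half-quadratic auxiliary variable for the Welsch loss:
        # the Gaussian weight exp(-r^2 / sigma^2).
        return np.exp(-(residual / sigma) ** 2)

    def hq_welsch_regression(X, y, sigma=1.0, n_iter=100, tol=1e-8):
        # Minimize sum_i rho(y_i - x_i^T beta) with the Welsch M-estimator by
        # alternating (i) the closed-form weight update and (ii) a weighted
        # least-squares solve (i.e. iteratively reweighted least squares).
        beta = np.linalg.lstsq(X, y, rcond=None)[0]        # ordinary LS start
        for _ in range(n_iter):
            r = y - X @ beta                               # residuals
            w = welsch_weight(r, sigma)                    # auxiliary weights
            Xw = X * w[:, None]                            # W X
            beta_new = np.linalg.solve(X.T @ Xw, Xw.T @ y) # X^T W X b = X^T W y
            if np.linalg.norm(beta_new - beta) < tol:
                return beta_new
            beta = beta_new
        return beta

    # Example: a line with a few gross outliers.
    rng = np.random.default_rng(0)
    x = rng.uniform(-1, 1, 200)
    X = np.column_stack([x, np.ones_like(x)])
    y = 2.0 * x + 0.5 + 0.05 * rng.normal(size=200)
    y[:10] += 5.0                                          # gross outliers
    print(np.linalg.lstsq(X, y, rcond=None)[0])            # pulled by outliers
    print(hq_welsch_regression(X, y, sigma=0.5))           # close to (2.0, 0.5)

On such data, the ordinary least-squares fit is pulled toward the outliers, while the half-quadratic Welsch fit stays close to the true line, because the Gaussian weights rapidly downweight large residuals.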


Notes

  1. Note that for different choices of Q(v, p), the dual potential function \(\varphi(\cdot)\) may differ; two standard forms are sketched below.
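
For reference, one common way to write a potential function \(\rho(v)\) in half-quadratic form (our own summary of the standard multiplicative and additive conventions, not a formula quoted from the chapter) is

\[
\rho(v) \;=\; \min_{p}\; Q(v,p) + \varphi(p),
\qquad
Q(v,p) = \tfrac{1}{2}\,p\,v^{2} \ \text{(multiplicative form)}
\quad\text{or}\quad
Q(v,p) = \tfrac{1}{2}\,(v-p)^{2} \ \text{(additive form)},
\]

so each choice of Q(v, p) induces its own dual potential \(\varphi(\cdot)\). For the Welsch function \(\rho(v) = \tfrac{\sigma^{2}}{2}\bigl(1 - e^{-v^{2}/\sigma^{2}}\bigr)\), the minimizer of the multiplicative form is the Gaussian weight \(p^{*} = \rho'(v)/v = e^{-v^{2}/\sigma^{2}}\), which is exactly the weight update used in the regression sketch above.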



Copyright information

© 2014 The Author(s)

About this chapter

Cite this chapter

He, R., Hu, B., Yuan, X., Wang, L. (2014). M-Estimators and Half-Quadratic Minimization. In: Robust Recognition via Information Theoretic Learning. SpringerBriefs in Computer Science. Springer, Cham. https://doi.org/10.1007/978-3-319-07416-0_2

  • DOI: https://doi.org/10.1007/978-3-319-07416-0_2

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-07415-3

  • Online ISBN: 978-3-319-07416-0

