Abstract
In robust statistics, there are several types of robust estimators, including M-estimators (maximum likelihood type estimators), L-estimators (linear combinations of order statistics), R-estimators (estimators based on rank transformations) [77], RM estimators (repeated medians) [141], and LMS estimators (least median of squares estimators) [133]. When information-theoretic learning is applied to robust statistics, the Gaussian kernel in entropy plays the role of the Welsch M-estimator and can be efficiently optimized by half-quadratic minimization. Hence, in this chapter, we introduce some basic concepts of M-estimation and half-quadratic minimization.
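To make the connection concrete, the following sketch (not the chapter's own code) applies the Welsch M-estimator to robust linear regression, solved by the multiplicative form of half-quadratic minimization: with loss \(\rho(e) = \tfrac{\sigma^2}{2}\big(1 - e^{-e^2/\sigma^2}\big)\), each iteration fixes auxiliary weights \(w_i = e^{-e_i^2/\sigma^2}\) and then solves a weighted least-squares problem. The function name, data, and \(\sigma\) value are illustrative assumptions.

```python
import numpy as np

def welsch_hq_regression(X, y, sigma=1.0, n_iter=50):
    """Robust linear regression under the Welsch M-estimator,
    minimized by half-quadratic (multiplicative form) iterations.

    Minimizes sum_i rho(y_i - x_i^T beta), where
    rho(e) = (sigma^2 / 2) * (1 - exp(-e^2 / sigma^2)).
    Each iteration fixes the auxiliary weights
    w_i = exp(-e_i^2 / sigma^2) and solves a weighted
    least-squares problem in beta (an IRLS-style update).
    """
    # Ordinary least-squares initialization.
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    for _ in range(n_iter):
        e = y - X @ beta
        w = np.exp(-(e ** 2) / sigma ** 2)   # half-quadratic auxiliary variable
        Xw = X * w[:, None]                  # row-weighted design matrix
        beta = np.linalg.solve(X.T @ Xw, Xw.T @ y)
    return beta

# Synthetic line y = 2x + 1 with 10% gross outliers.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
X = np.column_stack([x, np.ones_like(x)])
y = 2.0 * x + 1.0 + 0.01 * rng.standard_normal(50)
y[:5] += 10.0                                # contaminate the first 5 points
beta = welsch_hq_regression(X, y, sigma=1.0)
```

Because the Welsch weight decays exponentially in the squared residual, the gross outliers receive weights near zero after the first iteration, and the fit converges close to the uncontaminated slope and intercept.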
Notes
1. Note that for different types of Q(v, p), the dual potential functions \(\varphi(\cdot)\) may be different.
References
Allain, M., Idier, J., Goussard, Y.: On global and local convergence of half-quadratic algorithms. IEEE Transactions on Image Processing 15(5), 1030–1042 (2006)
Angst, R., Zach, C., Pollefeys, M.: The generalized trace norm and its application to structure from motion problems. In: International Conference on Computer Vision, pp. 2502–2509 (2011)
Bioucas-Dias, J., Figueiredo, M.: A new twist: Two-step iterative shrinkage/thresholding algorithms for image restoration. IEEE Transactions on Image Processing 16(12), 2992–3004 (2007)
Blake, A., Zisserman, A.: Visual Reconstruction. MIT Press, Cambridge, MA (1987)
Boyd, S., Vandenberghe, L.: Convex optimization. Cambridge University Press (2004)
Cetin, M., Karl, W.C.: Feature-enhanced synthetic aperture radar image formation based on nonquadratic regularization. IEEE Transactions on Image Processing 10(4), 623–631 (2001)
Champagnat, F., Idier, J.: A connection between half-quadratic criteria and EM algorithms. IEEE Signal Processing Letters 11(9), 709–712 (2004)
Charbonnier, P., Blanc-Feraud, L., Aubert, G., Barlaud, M.: Deterministic edge-preserving regularization in computed imaging. IEEE Transactions on Image Processing 6(2), 298–311 (1997)
Cheng, B., Yang, J., Yan, S., Fu, Y., Huang, T.S.: Learning with ℓ1-graph for image analysis. IEEE Transactions on Image Processing 19(4), 858–866 (2010)
Cover, T., Thomas, J.: Elements of Information Theory, 2nd edition. John Wiley & Sons, New Jersey (2005)
Donoho, D.L., Tsaig, Y.: Fast solution of l 1-norm minimization problems when the solution may be sparse. IEEE Transactions on Information Theory 54(11), 4789–4812 (2008)
Du, L., Li, X., Shen, Y.D.: Robust nonnegative matrix factorization via half-quadratic minimization. In: International Conference on Data Mining, pp. 201–210 (2012)
Geman, D., Yang, C.: Nonlinear image recovery with half-quadratic regularization. IEEE Transactions on Image Processing 4(7), 932–946 (1995)
Golub, G., Van Loan, C.: Matrix Computations, 3rd edition. Johns Hopkins University Press, Baltimore (1996)
He, R., Hu, B.G., Yuan, X., Zheng, W.S.: Principal component analysis based on nonparametric maximum entropy. Neurocomputing 73, 1840–1852 (2010)
He, X., Yan, S., Hu, Y., Niyogi, P., Zhang, H.J.: Face recognition using laplacianfaces. IEEE Transactions on Pattern Analysis and Machine Intelligence 27(3), 328–340 (2005)
Ho, J., Yang, M.H., Lim, J., Lee, K.C., Kriegman, D.: Clustering appearances of objects under varying illumination conditions. In: Proceedings of IEEE Conference on Computer Vision and Pattern Recognition, vol. 1, pp. 11–18 (2003)
Hyvarinen, A.: Fast and robust fixed-point algorithms for independent component analysis. IEEE Transactions on Neural Networks 10, 626–634 (1999)
Jenssen, R., Erdogmus, D., Principe, J., Eltoft, T.: Information theoretic angle-based spectral clustering: a theoretical analysis and an algorithm. In: International Joint Conference on Neural Networks, pp. 4904–4911 (2006)
Luenberger, D.: Optimization by vector space methods. Wiley (1969)
Meer, P., Stewart, C., Tyler, D.: Robust computer vision: An interdisciplinary challenge, guest editorial. Computer Vision and Image Understanding 78, 1–7 (2000)
Naseem, I., Togneri, R., Bennamoun, M.: Linear regression for face recognition. IEEE Transactions on Pattern Analysis and Machine Intelligence 32(11), 2106–2112 (2010)
Niu, G., Dai, B., Yamada, M., Sugiyama, M.: Information-theoretic semi-supervised metric learning via entropy regularization. In: International Conference on Machine Learning (2012)
Peng, H., Long, F., Ding, C.: Feature selection based on mutual information: criteria of max-dependency, max-relevance, and min-redundancy. IEEE Transactions on Pattern Analysis and Machine Intelligence 27(8), 1226–1238 (2005)
Santamaria, I., Pokharel, P.P., Principe, J.C.: Generalized correlation function: Definition, properties, and application to blind equalization. IEEE Transactions on Signal Processing 54(6), 2187–2197 (2006)
Seth, S., Principe, J.C.: Compressed signal reconstruction using the correntropy induced metric. In: Proceedings of IEEE Conference on Acoustics, Speech and Signal Processing, pp. 3845–3848 (2008)
Tao, D., Li, X., Wu, X., Maybank, S.: Tensor rank one discriminant analysis - a convergent method for discriminative multilinear subspace selection. Neurocomputing 71, 1866–1882 (2008)
Wright, J., Yang, A.Y., Ganesh, A., Sastry, S.S., Ma, Y.: Robust face recognition via sparse representation. IEEE Transactions on Pattern Analysis and Machine Intelligence 31(2), 210–227 (2009)
Yin, W., Osher, S., Goldfarb, D., Darbon, J.: Bregman iterative algorithms for ℓ 1-minimization with applications to compressed sensing. SIAM Journal on Imaging Sciences 1(1), 143–168 (2008)
Zhang, T.: Multi-stage convex relaxation for learning with sparse regularization. In: Proceedings of Neural Information Processing Systems, pp. 16–21 (2008)
Zhang, T.H., Tao, D.C., Li, X.L., Yang, J.: Patch alignment for dimensionality reduction. IEEE Trans. Knowl. Data Eng. 21(9), 1299–1313 (2009)
Zhang, Z.: Parameter estimation techniques: A tutorial with application to conic fitting. Image and Vision Computing 15(1), 59–76 (1997)
Copyright information
© 2014 The Author(s)
Cite this chapter
He, R., Hu, B., Yuan, X., Wang, L. (2014). M-Estimators and Half-Quadratic Minimization. In: Robust Recognition via Information Theoretic Learning. SpringerBriefs in Computer Science. Springer, Cham. https://doi.org/10.1007/978-3-319-07416-0_2
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-07415-3
Online ISBN: 978-3-319-07416-0