Abstract
Robust data classification or representation is a fundamental task with a long history in computer vision. Algorithmic robustness, derived from the statistical notion of a breakdown point [49, 106], is the ability of an algorithm to tolerate a large number of outliers. A robust method should therefore be able to reject outliers in an image and perform classification only on the uncorrupted pixels. Over the past decades, many methods for subspace learning [37, 91] and sparse signal representation [101, 154] have been developed to make image-based object recognition more robust. Despite significant progress, robust classification remains challenging because outliers in an image are unpredictable by nature: they may occupy any part of the image and take arbitrarily large values in magnitude [155].
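The breakdown point intuition can be illustrated with a classic one-dimensional sketch: the sample mean has a breakdown point of 0 (a single unbounded outlier can move it arbitrarily far), while the median has a breakdown point of 0.5 (it stays near the true value until more than half the samples are corrupted). The data sizes and corruption level below are illustrative, not taken from the chapter.

```python
import numpy as np

rng = np.random.default_rng(0)

# 70 clean samples clustered around the true value 5.0
clean = rng.normal(loc=5.0, scale=0.1, size=70)

# Corrupt 30% of the data with arbitrarily large outliers,
# mimicking gross pixel corruption of unbounded magnitude
outliers = np.full(30, 1e6)
data = np.concatenate([clean, outliers])

# Mean: breakdown point 0 -- dragged far away by the outliers
print("mean:  ", np.mean(data))

# Median: breakdown point 0.5 -- still close to 5.0
print("median:", np.median(data))
```

A robust estimator in this sense is exactly what a robust classifier needs at the pixel level: its answer should be governed by the uncorrupted majority of the data, not by the magnitude of the corruption.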
References
Daubechies, I., Defrise, M., De Mol, C.: An iterative thresholding algorithm for linear inverse problems with a sparsity constraint. Communications on Pure and Applied Mathematics 57(11), 1413–1457 (2004)
De la Torre, F., Black, M.: A framework for robust subspace learning. International Journal of Computer Vision 54(1–3), 117–142 (2003)
Ding, C., Zhou, D., He, X., Zha, H.: R1-PCA: rotational invariant L1-norm principal component analysis for robust subspace factorization. In: Proceedings of International Conference on Machine Learning (2006)
Elhamifar, E., Vidal, R.: Sparse subspace clustering: algorithm, theory, and applications. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(11), 2765–2781 (2013)
Fidler, S., Skocaj, D., Leonardis, A.: Combining reconstructive and discriminative subspace methods for robust classification and regression by subsampling. IEEE Transactions on Pattern Analysis and Machine Intelligence 28(3), 337–350 (2006)
He, R., Hu, B.G., Yuan, X., Zheng, W.S.: Principal component analysis based on nonparametric maximum entropy. Neurocomputing 73, 1840–1852 (2010)
He, R., Sun, Z., Tan, T., Zheng, W.S.: Recovery of corrupted low-rank matrices via half-quadratic based nonconvex minimization. In: IEEE Conference on Computer Vision and Pattern Recognition, pp. 2889–2896 (2011)
He, R., Tan, T., Wang, L.: Recovery of corrupted low-rank matrix by implicit regularizers. IEEE Transactions on Pattern Analysis and Machine Intelligence, 36(4), 770–783 (2014)
He, R., Tan, T., Wang, L., Zheng, W.S.: ℓ2,1 regularized correntropy for robust feature selection. In: Proceedings of IEEE Conference on Computer Vision and Pattern Recognition, pp. 2504–2511 (2012)
He, R., Zheng, W.S., Hu, B.G.: Maximum correntropy criterion for robust face recognition. IEEE Transactions on Pattern Analysis and Machine Intelligence 33(8), 1561–1576 (2011)
He, R., Zheng, W.S., Hu, B.G., Kong, X.W.: A regularized correntropy framework for robust pattern recognition. Neural Computation 23(8), 2074–2100 (2011)
Jenssen, R., Eltoft, T., Girolami, M., Erdogmus, D.: Kernel maximum entropy data transformation and an enhanced spectral clustering algorithm. In: Proceedings of Neural Information Processing Systems (2006)
Ji, Y., Lin, T., Zha, H.: Mahalanobis distance based non-negative sparse representation for face recognition. In: Proceedings of International Conference on Machine Learning and Applications, pp. 41–46 (2009)
Li, M., Chen, X., Li, X., Ma, B., Vitányi, P.M.B.: The similarity metric. IEEE Transactions on Information Theory 50(12), 3250–3264 (2004)
Liu, R., Li, S.Z., Yuan, X., He, R.: Online determination of track loss using template inverse matching. In: International Workshop on Visual Surveillance (2008)
Luenberger, D.: Optimization by vector space methods. Wiley (1969)
Mairal, J., Sapiro, G., Elad, M.: Learning multiscale sparse representations for image and video restoration. SIAM Multiscale Modeling & Simulation 7(1), 214–241 (2008)
Moulin, P., O’Sullivan, J.A.: Information-theoretic analysis of information hiding. IEEE Transactions on Information Theory 49(3), 563–593 (2003)
Nenadic, Z.: Information discriminant analysis: feature extraction with an information-theoretic objective. IEEE Transactions on Pattern Analysis and Machine Intelligence 29(8), 1394–1407 (2007)
Nikolova, M., Ng, M.K.: Analysis of half-quadratic minimization methods for signal and image recovery. SIAM Journal on Scientific Computing 27(3), 937–966 (2005)
Nowak, R., Figueiredo, M.: Fast wavelet-based image deconvolution using the EM algorithm. In: Proceedings of Asilomar Conference on Signals, Systems, and Computers, vol. 1, pp. 371–375 (2001)
Parzen, E.: On the estimation of probability density function and the mode. The Annals of Mathematical Statistics 33, 1065–1076 (1962)
Pokharel, P.P., Liu, W., Principe, J.C.: A low complexity robust detector in impulsive noise. Signal Processing 89(10), 1902–1909 (2009)
Principe, J., Xu, D., Zhao, Q., Fisher, J.: Learning from examples with information-theoretic criteria. Journal of VLSI Signal Processing 26, 61–77 (2000)
Viola, P., Schraudolph, N., Sejnowski, T.: Empirical entropy manipulation for real-world problems. In: Proceedings of Neural Information Processing Systems, pp. 851–857 (1995)
Rao, S., Liu, W., Principe, J.C., de Medeiros Martins, A.: Information theoretic mean shift algorithm. In: IEEE Workshop on Machine Learning for Signal Processing (2006)
Rényi, A.: On measures of entropy and information. Selected Papers of Alfréd Rényi 2, 565–580 (1976)
Rockafellar, R.: Convex analysis. Princeton University Press (1970)
Sharma, A., Paliwal, K.: Fast principal component analysis using fixed-point algorithm. Pattern Recognition Letters 28, 1151–1155 (2007)
Shi, Q., Eriksson, A., van den Hengel, A., Shen, C.: Is face recognition really a compressive sensing problem? In: Proceedings of IEEE Conference on Computer Vision and Pattern Recognition, pp. 553–560 (2011)
Takhar, D., Laska, J., Wakin, M., Duarte, M., Baron, D., Sarvotham, S., Kelly, K., Baraniuk, R.: A new compressive imaging camera architecture using optical-domain compression. In: Proceedings of Computational Imaging IV at SPIE Electronic Imaging, pp. 43–52 (2006)
Vinh, N.X., Epps, J., Bailey, J.: Information theoretic measures for clusterings comparison: Variants, properties, normalization and correction for chance. Journal of Machine Learning Research 11, 2837–2854 (2010)
Weiszfeld, E.: Sur le point pour lequel la somme des distances de n points donnés est minimum. Tôhoku Mathematical Journal 43, 355–386 (1937)
Xing, E.P., Ng, A.Y., Jordan, M.I., Russell, S.: Distance metric learning with application to clustering with side-information. In: Proceedings of Advances in Neural Information Processing Systems, vol. 15, pp. 505–512 (2002)
Xu, D.: Energy, entropy and information potential for neural computation. Ph.D. thesis, University of Florida (1999)
Yang, A.Y., Sastry, S.S., Ganesh, A., Ma, Y.: Fast ℓ1-minimization algorithms and an application in robust face recognition: a review. In: Proceedings of International Conference on Image Processing (2010)
Yuan, X.T., Li, S.: Half quadratic analysis for mean shift: with extension to a sequential data mode-seeking method. In: IEEE International Conference on Computer Vision (2007)
Zhang, T.: Multi-stage convex relaxation for learning with sparse regularization. In: Proceedings of Neural Information Processing Systems, pp. 16–21 (2008)
Zhang, T.H., Tao, D.C., Li, X.L., Yang, J.: Patch alignment for dimensionality reduction. IEEE Transactions on Knowledge and Data Engineering 21(9), 1299–1313 (2009)
Zou, H.: The adaptive lasso and its oracle properties. Journal of the American Statistical Association 101(476), 1418–1429 (2006)
Zhang, Y., Sun, Z., He, R., Tan, T.: Robust subspace clustering via half-quadratic minimization. In: International Conference on Computer Vision (2013)
Copyright information
© 2014 The Author(s)
He, R., Hu, B., Yuan, X., Wang, L. (2014). Introduction. In: Robust Recognition via Information Theoretic Learning. SpringerBriefs in Computer Science. Springer, Cham. https://doi.org/10.1007/978-3-319-07416-0_1
Print ISBN: 978-3-319-07415-3
Online ISBN: 978-3-319-07416-0