Abstract
High-dimensional spaces pose a challenge to any classification task: they typically contain substantial redundancy, so reducing the dimensionality of the data becomes crucial for analysis, density modeling, and classification. In this paper, we present a method for dimensionality reduction in mixture models and its use in classification. For each component of the mixture, the data are projected by a linear transformation onto a lower-dimensional space, and the projection matrices and the densities in these compressed spaces are then learned by means of an Expectation-Maximization (EM) algorithm. This approach, however, raises two main issues: 1) the scale of the densities can differ across the mixture components, and 2) a singularity problem may occur. We propose solutions to both problems and validate the method on three image data sets from the UCI Machine Learning Repository, comparing its classification performance with that of a mixture of probabilistic principal component analysers (MPPCA). Across the three data sets, our accuracy always compares favourably, with improvements ranging from 2.5% to 35.4%.
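The per-component projection-plus-density scheme described above can be sketched with a simplified, hard-assignment variant of EM (classification EM) rather than the paper's full algorithm. Everything below is illustrative assumption, not the authors' implementation: the toy data, the choice of top principal directions as each component's projection, the ridge term guarding against singular covariances, and all helper names are invented for this sketch. In particular, comparing densities across components is only straightforward here because every component uses the same subspace dimension q; handling unequal scales is exactly the normalization issue the paper addresses.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_component(X, q):
    """Fit a linear projection W (D x q) and a Gaussian over the projected data."""
    mu = X.mean(axis=0)
    Xc = X - mu
    # One simple choice of projection: the top-q principal directions.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    W = Vt[:q].T                                   # D x q
    Z = Xc @ W                                     # data in the compressed space
    # A small ridge guards against the singularity problem the abstract mentions.
    cov = np.cov(Z, rowvar=False) + 1e-6 * np.eye(q)
    return mu, W, cov

def log_density(X, mu, W, cov):
    """Log N(z; 0, cov) for each point after projection z = W^T (x - mu)."""
    Z = (X - mu) @ W
    q = cov.shape[0]
    quad = np.einsum('ij,jk,ik->i', Z, np.linalg.inv(cov), Z)
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (q * np.log(2.0 * np.pi) + logdet + quad)

def hard_em(X, init_labels, K, q, n_iter=20):
    """Alternate per-component fits and hard reassignment by subspace density.
    The sketch assumes every component keeps at least one point; the direct
    density comparison is only fair because all components share the same q."""
    labels = init_labels.copy()
    for _ in range(n_iter):
        params = [fit_component(X[labels == k], q) for k in range(K)]
        logp = np.column_stack([log_density(X, *p) for p in params])
        labels = logp.argmax(axis=1)
    return labels

# Toy data with genuine low-dimensional structure: 2-D latents mapped into 5-D.
A = rng.normal(size=(5, 2))                        # shared loading matrix
Z1 = rng.normal([0.0, 0.0], 1.0, (100, 2))         # latent blob 1
Z2 = rng.normal([8.0, 0.0], 1.0, (100, 2))         # latent blob 2, well separated
X = np.vstack([Z1, Z2]) @ A.T + 0.1 * rng.normal(size=(200, 5))

# Initialize the partition by nearest seed, one seed drawn from each blob.
seeds = X[[0, 100]]
init = np.argmin(((X[:, None, :] - seeds[None]) ** 2).sum(-1), axis=1)
labels = hard_em(X, init, K=2, q=2)
```

On this toy set the two blobs should end up in different components; real use, as in the paper, would add the cross-component normalization and a soft (responsibility-weighted) E-step.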
References
Bellman, R.: Adaptive Control Processes - A Guided Tour. Princeton University Press, Princeton (1961)
Bishop, C.M.: Pattern Recognition and Machine Learning. Springer, Heidelberg (2006)
Tipping, M.E., Bishop, C.M.: Probabilistic principal component analysis. Journal of the Royal Statistical Society: Series B (Statistical Methodology) 61(3), 611–622 (1999)
Bartholomew, D.J.: Latent Variable Models and Factor Analysis. Charles Griffin & Co. Ltd., London (1987)
Hinton, G.E., Dayan, P., Revow, M.: Modeling the manifolds of images of handwritten digits. IEEE Transactions on Neural Networks 8(1), 65–74 (1997)
Tipping, M.E., Bishop, C.M.: Mixtures of probabilistic principal component analyzers. Neural Computation 11(2), 443–482 (1999)
Ghahramani, Z., Hinton, G.E.: The EM algorithm for mixtures of factor analyzers. Technical Report CRG-TR-96-1, University of Toronto (1997)
Piccardi, M., Gunes, H., Otoom, A.F.: Maximum-likelihood dimensionality reduction in Gaussian mixture models with an application to object classification. In: 19th IEEE International Conference on Pattern Recognition, Tampa, FL, USA (2008)
Asuncion, A., Newman, D.J.: UCI Machine Learning Repository. University of California, Irvine (2007)
Kittler, J.V.: Combining classifiers: A theoretical framework. Pattern Analysis and Applications 1(1), 18–27 (1998)
Figueiredo, M.A.F., Jain, A.K.: Unsupervised learning of finite mixture models. IEEE Transactions on Pattern Analysis and Machine Intelligence 24(3), 381–396 (2002)
Bolton, R.J., Krzanowski, W.J.: A characterization of principal components for projection pursuit. The American Statistician 53(2), 108–109 (1999)
Zhong, P., Fukushima, M.: A regularized nonsmooth newton method for multi-class support vector machines. Technical report, Department of Applied Mathematics and Physics, Kyoto University (2005)
Mangasarian, O.L., Wild, E.W.: Nonlinear knowledge-based classification. Technical Report 06-04, Data Mining Institute (2006)
Keysers, D., Gollan, C., Ney, H.: Local context in non-linear deformation models for handwritten character recognition. In: International Conference on Pattern Recognition, vol. 4, pp. 511–514 (2004)
© 2009 Springer-Verlag Berlin Heidelberg
Cite this paper
Otoom, A.F., Perez Concha, O., Gunes, H., Piccardi, M. (2009). Mixtures of Normalized Linear Projections. In: Blanc-Talon, J., Philips, W., Popescu, D., Scheunders, P. (eds) Advanced Concepts for Intelligent Vision Systems. ACIVS 2009. Lecture Notes in Computer Science, vol 5807. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-04697-1_7
Print ISBN: 978-3-642-04696-4
Online ISBN: 978-3-642-04697-1