Abstract
We study the problem of multimodal dimensionality reduction assuming that data samples can be missing at training time, and not all data modalities may be present at application time. Maximum covariance analysis, as a generalization of PCA, has many desirable properties, but its application to practical problems is limited by its need for perfectly paired data. We overcome this limitation by a latent variable approach that allows working with weakly paired data and is still able to efficiently process large datasets using standard numerical routines. The resulting weakly paired maximum covariance analysis often finds better representations than alternative methods, as we show in two exemplary tasks: texture discrimination and transfer learning.
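For readers unfamiliar with the method the abstract builds on: maximum covariance analysis finds, for two paired data modalities, the projection directions that maximize the covariance between the projected samples, which reduces to a singular value decomposition of the empirical cross-covariance matrix. The following minimal NumPy sketch (the function name `mca` and the toy setup are ours, not from the paper) illustrates the fully paired case; the weakly paired variant proposed in the paper additionally alternates such a step with estimating the unknown sample pairings.

```python
import numpy as np

def mca(X, Y, d):
    """Fully paired maximum covariance analysis.

    X: (n, p) samples of modality 1, Y: (n, q) samples of modality 2,
    row i of X is paired with row i of Y. Returns (p, d) and (q, d)
    projection matrices whose d column pairs maximize projected covariance.
    """
    # center each modality
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    # empirical cross-covariance matrix between the modalities, shape (p, q)
    C = Xc.T @ Yc / (X.shape[0] - 1)
    # the top-d singular vector pairs of C maximize the projected covariance
    U, s, Vt = np.linalg.svd(C, full_matrices=False)
    return U[:, :d], Vt[:d].T

# toy usage: two noisy views of a shared 2-dimensional latent signal
rng = np.random.default_rng(0)
Z = rng.normal(size=(300, 2))
X = Z @ rng.normal(size=(2, 6)) + 0.05 * rng.normal(size=(300, 6))
Y = Z @ rng.normal(size=(2, 4)) + 0.05 * rng.normal(size=(300, 4))
Wx, Wy = mca(X, Y, 2)
```

Because only a single SVD of a p-by-q matrix is required, standard numerical routines handle large sample counts easily, which is the efficiency property the abstract alludes to.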
Keywords
- Dimensionality Reduction
- Linear Discriminant Analysis
- Local Binary Pattern
- Canonical Correlation Analysis
- Transfer Learning
Copyright information
© 2010 Springer-Verlag Berlin Heidelberg
Cite this paper
Lampert, C.H., Krömer, O. (2010). Weakly-Paired Maximum Covariance Analysis for Multimodal Dimensionality Reduction and Transfer Learning. In: Daniilidis, K., Maragos, P., Paragios, N. (eds) Computer Vision – ECCV 2010. ECCV 2010. Lecture Notes in Computer Science, vol 6312. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-15552-9_41
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-15551-2
Online ISBN: 978-3-642-15552-9