High Dimensional Correspondences from Low Dimensional Manifolds – An Empirical Comparison of Graph-Based Dimensionality Reduction Algorithms

  • Ribana Roscher
  • Falko Schindler
  • Wolfgang Förstner
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6469)


We discuss the utility of dimensionality reduction algorithms for putting data points in high dimensional spaces into correspondence by learning a transformation between assigned data points on a lower dimensional structure. We assume that similar high dimensional feature spaces are characterized by a similar underlying low dimensional structure. To determine an affine transformation between two data sets, we make use of well-known dimensionality reduction algorithms. We demonstrate this procedure for applications such as classification and assignment between two given data sets, and we evaluate six well-known algorithms in several experiments with different objectives. We show that with these algorithms and our transformation approach, high dimensional data sets can be related to each other. We also show that linear methods are more suitable for assignment tasks, whereas graph-based methods appear superior for classification tasks.
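The procedure described above can be sketched in code. The following is a minimal illustration under stated assumptions, not the authors' implementation: plain PCA stands in for the six dimensionality reduction algorithms evaluated in the paper, the function names `pca_embed` and `fit_affine` are illustrative, and the toy data sets are constructed (via an orthogonal map plus offset) so that their low dimensional structures are exactly affinely related.

```python
import numpy as np

def pca_embed(X, d):
    # Center the data and project onto the top-d principal directions
    # (classical PCA via SVD); stands in for any of the six algorithms.
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:d].T

def fit_affine(Y_src, Y_tgt):
    # Least-squares affine map (A, t) with Y_tgt ≈ Y_src @ A.T + t,
    # estimated from assigned point pairs in the two embeddings.
    n = len(Y_src)
    M = np.hstack([Y_src, np.ones((n, 1))])
    W, *_ = np.linalg.lstsq(M, Y_tgt, rcond=None)
    return W[:-1].T, W[-1]

# Toy demonstration: two "high dimensional" feature spaces related by a
# known transformation (hypothetical data, for illustration only).
rng = np.random.default_rng(0)
X1 = rng.normal(size=(100, 20))
Q, _ = np.linalg.qr(rng.normal(size=(20, 20)))   # orthogonal map
X2 = X1 @ Q + rng.normal(size=20)                # second, related data set

Y1, Y2 = pca_embed(X1, 2), pca_embed(X2, 2)      # low dimensional structures
A, t = fit_affine(Y1, Y2)                        # transformation between them
err = np.abs(Y1 @ A.T + t - Y2).max()            # residual of the assignment
```

Because the second data set is an orthogonal image of the first, its 2-D PCA embedding is an exact affine image of the first embedding, so the fitted residual `err` is numerically zero; with real, merely similar feature spaces the fit is approximate, as in the experiments of the paper.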


Keywords: Dimensionality Reduction · High Dimensional Space · Locally Linear Embedding · Kernel Principal Component Analysis · Handwritten Digits





Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Ribana Roscher¹
  • Falko Schindler¹
  • Wolfgang Förstner¹

  1. Department of Photogrammetry, Institute of Geodesy and Geoinformation, University of Bonn, Germany
