
Incremental Alignment Manifold Learning

  • Regular Paper
  • Published in the Journal of Computer Science and Technology

Abstract

A new manifold learning method, called the incremental alignment method (IAM), is proposed for nonlinear dimensionality reduction of high-dimensional data with intrinsic low dimensionality. The main idea is to incrementally align the low-dimensional coordinates of the input data patch by patch, iteratively building up a representation of the entire dataset. The method consists of two major steps: the incremental step and the alignment step. The incremental step searches for the next neighborhood patch to be aligned, and the alignment step iteratively aligns the low-dimensional coordinates of that patch with the embedding constructed so far, eventually producing the embedding of the entire dataset. Compared with existing manifold learning methods, the proposed method has several advantages: high efficiency, easy out-of-sample extension, good metric preservation, and avoidance of the local-minima issue. All of these properties are supported by a series of experiments on synthetic and real-life datasets. In addition, the computational complexity of the proposed method is analyzed, and its efficiency is argued theoretically and demonstrated experimentally.
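The patch-by-patch idea described in the abstract can be illustrated with a minimal sketch. This is not the authors' algorithm: the fixed left-to-right patch sweep, the local-PCA patch coordinates, and the least-squares affine fit over overlap points are all simplifying assumptions made for this example (IAM chooses patches incrementally and aligns them iteratively), but the sketch shows the general incremental-alignment structure: embed one patch, then repeatedly map each new patch's local coordinates onto the coordinates already fixed for the points the patches share.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 200 points on a 3-D helix, whose intrinsic dimension is 1.
t = np.sort(rng.uniform(0, 4 * np.pi, 200))
X = np.column_stack([np.cos(t), np.sin(t), 0.2 * t])

def patch_coords(P):
    """1-D local coordinates of one patch via PCA (first principal direction)."""
    C = P - P.mean(axis=0)
    _, _, vt = np.linalg.svd(C, full_matrices=False)
    return C @ vt[0]

# Split the data into overlapping index patches. (A fixed sweep stands in
# for the paper's incremental step, which selects the next patch adaptively.)
size, overlap = 40, 10
starts = range(0, len(X) - overlap, size - overlap)
patches = [np.arange(s, min(s + size, len(X))) for s in starts]

# Alignment step: fix the first patch, then map each new patch's local
# coordinates onto the growing embedding with a least-squares affine fit
# (scale + offset) over the points shared with what is already embedded.
Y = np.full(len(X), np.nan)
Y[patches[0]] = patch_coords(X[patches[0]])
for idx in patches[1:]:
    local = patch_coords(X[idx])
    known = ~np.isnan(Y[idx])                      # overlap with the embedding
    A = np.column_stack([local[known], np.ones(known.sum())])
    (a, b), *_ = np.linalg.lstsq(A, Y[idx][known], rcond=None)
    Y[idx[~known]] = a * local[~known] + b          # place the new points

# A good 1-D embedding of the helix should vary monotonically with t.
corr = abs(np.corrcoef(Y, t)[0, 1])
print(f"|corr(embedding, t)| = {corr:.3f}")
```

The affine fit on the overlap also absorbs the sign ambiguity of PCA (a negative scale flips a mirrored patch), which is one reason overlap-based alignment is robust to how each patch is embedded locally.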



Author information


Corresponding author

Correspondence to De-Yu Meng.

Additional information

This work was supported by the National Basic Research 973 Program of China under Grant No. 2007CB311002 and the National Natural Science Foundation of China under Grant No. 60905003.

Electronic supplementary material

Electronic supplementary material is available online (PDF 79.3 kb).


Cite this article

Han, Z., Meng, DY., Xu, ZB. et al. Incremental Alignment Manifold Learning. J. Comput. Sci. Technol. 26, 153–165 (2011). https://doi.org/10.1007/s11390-011-9422-9


