Dynamic Neighborhood Selection for Nonlinear Dimensionality Reduction

  • Yubin Zhan
  • Jianping Yin
  • Jun Long
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5861)


Neighborhood construction is a necessary and important step in nonlinear dimensionality reduction algorithms. In this paper, we first summarize two principles for neighborhood construction by analyzing existing nonlinear dimensionality reduction algorithms: 1) data points in the same neighborhood should approximately lie on a low-dimensional linear subspace; and 2) each neighborhood should be as large as possible. We then propose a dynamic neighborhood selection algorithm based on these two principles. The proposed method exploits the PCA technique to measure the linearity of a finite point set. Moreover, for isometric embedding, we present an improved method of constructing the neighborhood graph, which improves the accuracy of geodesic distance estimation. Experiments on both synthetic and real data sets show that our method can construct neighborhoods according to the local curvature of the data manifold and thereby improve the performance of most manifold learning algorithms, such as ISOMAP and LLE.


neighborhood construction, manifold learning, local linearity, geodesic distance
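The core idea in the abstract, using PCA to score how linear a candidate neighborhood is, then growing the neighborhood as large as possible while it stays nearly linear, can be sketched as follows. This is a minimal illustration, not the authors' exact algorithm: the greedy growth rule, the linearity threshold `tol`, and the bounds `k_min`/`k_max` are assumptions made for the sketch.

```python
import numpy as np

def linearity(points, d):
    """Fraction of total variance captured by the top-d principal
    components. A value near 1 means the points approximately lie
    on a d-dimensional linear subspace (principle 1 in the paper)."""
    centered = points - points.mean(axis=0)
    # Squared singular values are proportional to the PCA eigenvalues.
    s = np.linalg.svd(centered, compute_uv=False)
    var = s ** 2
    return var[:d].sum() / var.sum()

def dynamic_neighborhood(X, i, d, k_min=4, k_max=20, tol=0.95):
    """Grow point i's neighborhood over its nearest neighbors for as
    long as the candidate neighborhood stays nearly d-linear
    (principle 2: make each neighborhood as large as possible)."""
    dists = np.linalg.norm(X - X[i], axis=1)
    order = np.argsort(dists)  # order[0] is point i itself
    k = k_min
    # Greedily admit the next nearest neighbor while the candidate
    # set (i plus its k+1 nearest neighbors) remains nearly linear.
    while k < k_max and linearity(X[order[:k + 2]], d) >= tol:
        k += 1
    return order[1:k + 1]      # neighbor indices, excluding i
```

On a flat region of the manifold the linearity score stays high and the neighborhood grows toward `k_max`; in a highly curved region the score drops quickly and the growth stops early, which is the adaptive behavior the abstract describes.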




  1. Jolliffe, I.T.: Principal Component Analysis. Springer, Heidelberg (1989)
  2. Cox, T., Cox, M.: Multidimensional Scaling. Chapman and Hall, Boca Raton (1994)
  3. Tenenbaum, J.B., de Silva, V., Langford, J.C.: A Global Geometric Framework for Nonlinear Dimensionality Reduction. Science 290, 2319–2323 (2000)
  4. Saul, L.K., Roweis, S.T.: Think globally, fit locally: unsupervised learning of low dimensional manifolds. Journal of Machine Learning Research 4, 119–155 (2003)
  5. Roweis, S.T., Saul, L.K.: Nonlinear Dimensionality Reduction by Locally Linear Embedding. Science 290, 2323–2326 (2000)
  6. Belkin, M., Niyogi, P.: Laplacian eigenmaps for dimensionality reduction and data representation. Neural Comput. 15, 1373–1396 (2003)
  7. Zhang, Z., Zha, H.: Principal Manifolds and Nonlinear Dimension Reduction via Local Tangent Space Alignment. SIAM J. Scientific Computing 26, 313–338 (2005)
  8. Balasubramanian, M., Schwartz, E.L., Tenenbaum, J.B., de Silva, V., Langford, J.C.: The Isomap Algorithm and Topological Stability. Science 295 (2002)
  9. Yang, L.: Building k-edge-connected neighborhood graph for distance-based data projection. Pattern Recognit. Lett. 26, 2015–2021 (2005)
  10. Yang, L.: Building k-connected neighborhood graphs for isometric data embedding. IEEE Transactions on Pattern Analysis and Machine Intelligence 28, 827–831 (2006)
  11. Yang, L.: Building connected neighborhood graphs for isometric data embedding. In: Proceedings of the eleventh ACM SIGKDD international conference on Knowledge discovery in data mining. ACM, Chicago (2005)
  12. Yang, L.: Building Connected Neighborhood Graphs for Locally Linear Embedding. In: 18th International Conference on Pattern Recognition, ICPR 2006, vol. 4, pp. 194–197 (2006)
  13. Samko, O., Marshall, A.D., Rosin, P.L.: Selection of the optimal parameter value for the Isomap algorithm. Pattern Recognit. Lett. 27, 968–979 (2006)
  14. Xia, T., Li, J., Zhang, Y., Tang, S.: A More Topologically Stable Locally Linear Embedding Algorithm Based on R*-Tree. In: Washio, T., Suzuki, E., Ting, K.M., Inokuchi, A. (eds.) PAKDD 2008. LNCS (LNAI), vol. 5012, pp. 803–812. Springer, Heidelberg (2008)
  15. Shao, C., Huang, H., Zhao, L.: A More Topologically Stable ISOMAP Algorithm. Journal of Software 18, 869–877 (2007)
  16. Shao, C., Huang, H., Wan, C.: Selection of the Suitable Neighborhood Size for the ISOMAP Algorithm. In: International Joint Conference on Neural Networks, IJCNN 2007, pp. 300–305 (2007)
  17. Lin, T., Zha, H.: Riemannian Manifold Learning. IEEE Trans. Pattern Anal. Mach. Intell. 30, 796–809 (2008)
  18. Yan, S., Tang, X.: Largest-eigenvalue-theory for incremental principal component analysis. In: IEEE International Conference on Image Processing, vol. 1 (2005)

Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Yubin Zhan (1)
  • Jianping Yin (1)
  • Jun Long (1)
  1. Computer School, National University of Defense Technology, Changsha, China
