Dynamic Neighborhood Selection for Nonlinear Dimensionality Reduction
Neighborhood construction is a necessary and important step in nonlinear dimensionality reduction algorithms. In this paper, we first summarize two principles for neighborhood construction, obtained by analyzing existing nonlinear dimensionality reduction algorithms: 1) data points in the same neighborhood should approximately lie on a low-dimensional linear subspace; and 2) each neighborhood should be as large as possible. We then propose a dynamic neighborhood selection algorithm based on these two principles. The proposed method exploits the PCA technique to measure the linearity of a finite point set. Moreover, for isometric embedding, we present an improved method of constructing the neighborhood graph, which improves the accuracy of geodesic distance estimation. Experiments on both synthetic and real data sets show that our method constructs neighborhoods according to the local curvature of the data manifold and thereby improves the performance of manifold learning algorithms such as ISOMAP and LLE.
Keywords: neighborhood construction · manifold learning · local linearity · geodesic distance
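The abstract's two principles suggest a simple procedure: use PCA to score how well a candidate point set fits a low-dimensional linear subspace, and grow each neighborhood until that score falls below a threshold. The sketch below illustrates this idea; the function names, the minimum size `k_min`, and the linearity tolerance `tol` are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def linearity(points, d=1):
    """Fraction of variance captured by the top-d principal components.
    A value close to 1 means the points lie near a d-dimensional
    linear subspace (principle 1 from the abstract)."""
    X = points - points.mean(axis=0)
    # Singular values of the centered data give the PCA spectrum.
    s = np.linalg.svd(X, compute_uv=False)
    var = s ** 2
    return var[:d].sum() / var.sum()

def dynamic_neighborhood(data, i, d=1, k_min=4, tol=0.95):
    """Grow point i's neighborhood one nearest neighbor at a time
    (principle 2: as large as possible), stopping once the set
    deviates too far from a d-dimensional linear subspace.
    k_min and tol are assumed parameters for this sketch."""
    dists = np.linalg.norm(data - data[i], axis=1)
    order = np.argsort(dists)            # order[0] is i itself
    k = k_min
    while k + 1 < len(data):
        candidate = data[order[:k + 2]]  # i plus its k+1 nearest points
        if linearity(candidate, d) < tol:
            break
        k += 1
    return order[1:k + 1]                # neighbor indices, excluding i
```

On a flat region of the manifold the linearity score stays high and the neighborhood keeps growing; near high-curvature regions it shrinks, which is the adaptive behavior the abstract describes.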