
Dimensionality Reduction by Low-Rank Embedding

  • Conference paper
Intelligent Science and Intelligent Data Engineering (IScIDE 2012)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 7751)

Abstract

We consider the dimensionality reduction task in the scenario where data vectors lie on (or near) multiple independent linear subspaces. We propose a robust dimensionality reduction algorithm, named Low-Rank Embedding (LRE). In LRE, the affinity weights are computed via low-rank representation, and the embedding is obtained by a spectral method. Because the affinity weights are induced by the low-rank model, LRE reveals the subtle multiple-subspace structure robustly. By virtue of the spectral method, LRE transforms this multiple-subspace structure into multiple clusters in a low-dimensional Euclidean space, in which most ordinary algorithms perform well. To demonstrate the advantage of the proposed LRE, we conducted comparative experiments on toy data sets and benchmark data sets. The experimental results confirm that LRE outperforms the compared algorithms.
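The pipeline described in the abstract (low-rank representation to obtain affinity weights, followed by a spectral embedding) can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes noise-free data, for which the low-rank representation min ||Z||_* s.t. X = XZ has the known closed-form solution Z* = V_r V_r^T from the skinny SVD of X (Liu et al., ICML 2010); the function name, the rank-estimation threshold, and the normalized-Laplacian embedding step are illustrative choices.

```python
import numpy as np

def low_rank_embedding(X, d, rank=None):
    """Sketch of Low-Rank Embedding (LRE) for clean data.

    X    : (D, n) data matrix whose columns are samples.
    d    : target embedding dimension.
    rank : assumed rank of the union of subspaces; if None, it is
           estimated from the singular-value spectrum (an assumption).
    """
    # Step 1: low-rank representation. For noiseless X, the minimizer of
    # min ||Z||_* s.t. X = XZ is Z* = V_r V_r^T (shape interaction matrix).
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    if rank is None:
        rank = int(np.sum(s > 1e-8 * s[0]))
    Vr = Vt[:rank].T                      # (n, rank)
    Z = Vr @ Vr.T                         # (n, n) low-rank representation

    # Step 2: symmetric, non-negative affinity weights from Z.
    W = np.abs(Z) + np.abs(Z.T)

    # Step 3: spectral embedding via the normalized graph Laplacian.
    deg = W.sum(axis=1)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(deg, 1e-12)))
    L = np.eye(W.shape[0]) - d_inv_sqrt @ W @ d_inv_sqrt
    _, evecs = np.linalg.eigh(L)          # ascending eigenvalues
    # skip the trivial constant eigenvector; keep the next d
    return evecs[:, 1:d + 1]              # (n, d) embedded coordinates
```

For data drawn from independent subspaces, Z* is block-diagonal up to permutation, so the induced affinity separates the subspaces into clusters in the embedded space; handling noisy data, as the paper's robustness claim requires, would replace the closed-form step with an iterative LRR solver.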






Copyright information

© 2013 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Li, CG., Qi, X., Guo, J. (2013). Dimensionality Reduction by Low-Rank Embedding. In: Yang, J., Fang, F., Sun, C. (eds) Intelligent Science and Intelligent Data Engineering. IScIDE 2012. Lecture Notes in Computer Science, vol 7751. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-36669-7_23

Download citation

  • DOI: https://doi.org/10.1007/978-3-642-36669-7_23

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-36668-0

  • Online ISBN: 978-3-642-36669-7

  • eBook Packages: Computer Science (R0)
