
Orthogonal Projection Analysis

  • Conference paper
Intelligent Science and Intelligent Data Engineering (IScIDE 2011)

Part of the book series: Lecture Notes in Computer Science (LNIP, volume 7202)

Abstract

In this paper, we propose a novel linear dimensionality reduction algorithm, called Orthogonal Projection Analysis (OPA), derived from a gradient field perspective. Our approach is based on two criteria. First, the linear map should preserve the metric of the ambient space, under the assumption that this metric is reliable. Second is the well-known smoothness criterion, which is critical for clustering. Interestingly, the gradient field is a natural tool for connecting these two requirements. We give a continuous objective function based on gradient fields and discuss how to discretize it using tangent spaces. We also show the geometric meaning of our approach: it requires the gradient field to be as orthogonal as possible to the tangent spaces. Experimental results demonstrate the effectiveness of the proposed approach.
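The full paper is behind the access wall here, but the abstract already pins down the geometry: for a linear feature f(x) = aᵀx the gradient field is the constant vector a, so "as orthogonal as possible to the tangent spaces" amounts to minimizing the average tangential component of a over the data. The sketch below is a minimal illustration of that reading, not the paper's exact algorithm: tangent spaces are estimated by local PCA over k nearest neighbors, and the helper names (`estimate_tangent_spaces`, `opa_directions`), the neighborhood size k, the tangent dimension d, and the choice of smallest-eigenvalue eigenvectors are all assumptions.

```python
import numpy as np

def estimate_tangent_spaces(X, k=10, d=2):
    """Estimate a d-dimensional tangent basis at each point by local PCA
    over its k nearest neighbors (brute-force distances; illustrative only)."""
    n = X.shape[0]
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise squared distances
    bases = []
    for i in range(n):
        idx = np.argsort(sq[i])[1:k + 1]        # k nearest neighbors, self excluded
        nbrs = X[idx] - X[idx].mean(axis=0)     # center the neighborhood
        # top-d right singular vectors span the estimated tangent space
        _, _, Vt = np.linalg.svd(nbrs, full_matrices=False)
        bases.append(Vt[:d].T)                  # shape (D, d), orthonormal columns
    return bases

def opa_directions(X, m=2, k=10, d=2):
    """Return m orthonormal projection directions whose (constant) gradients
    are, on average, as orthogonal as possible to the estimated tangent spaces."""
    D = X.shape[1]
    M = np.zeros((D, D))
    for T in estimate_tangent_spaces(X, k=k, d=d):
        M += T @ T.T                            # sum of tangent-space projectors
    # For a linear feature f(x) = a^T x the gradient is the constant vector a,
    # and a^T M a measures its average tangential energy, so we take the
    # eigenvectors of M with the smallest eigenvalues.
    evals, evecs = np.linalg.eigh(M)            # eigenvalues in ascending order
    return evecs[:, :m]                         # D x m projection matrix

# Toy usage: two well-separated Gaussian blobs in 10-D.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.1, (50, 10)),
               rng.normal(3, 0.1, (50, 10))])
A = opa_directions(X, m=2)
Y = X @ A                                       # low-dimensional embedding
```

Under this reading, the orthonormality of the returned eigenvectors stands in for the metric-preservation criterion, and the smallest-eigenvalue directions vary least along the estimated manifold, which is the clustering-friendly smoothness the abstract emphasizes.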

Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Lin, B., Zhang, C., He, X. (2012). Orthogonal Projection Analysis. In: Zhang, Y., Zhou, ZH., Zhang, C., Li, Y. (eds) Intelligent Science and Intelligent Data Engineering. IScIDE 2011. Lecture Notes in Computer Science, vol 7202. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-31919-8_1

  • DOI: https://doi.org/10.1007/978-3-642-31919-8_1

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-31918-1

  • Online ISBN: 978-3-642-31919-8

  • eBook Packages: Computer Science, Computer Science (R0)
