
Incorporating New Points


Part of the book series: SpringerBriefs in Computer Science (BRIEFSCOMPUTER)


Abstract

This chapter outlines and assesses methods for the out-of-sample extension problem (mapping new points from the high-dimensional space into the low-dimensional space) and the pre-image problem (mapping new points from the low-dimensional space back into the high-dimensional space). In addition, methods for incremental learning, the process of updating a low-dimensional embedding as new data points arrive rather than recomputing it in batch, are discussed.
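A common approach to the out-of-sample extension problem described above is a Nyström-style projection: a new point's kernel row against the training set is centred and projected onto the eigenvectors computed during training, so no embedding needs to be recomputed. The following is a minimal numpy sketch, not taken from the chapter itself; the function names, the RBF kernel choice, and the kernel PCA setting are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Pairwise Gaussian (RBF) kernel between the rows of A and B.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * d2)

def fit_kernel_pca(X, n_components=2, gamma=1.0):
    # Batch training step: eigendecompose the double-centred kernel matrix.
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    H = np.eye(n) - np.ones((n, n)) / n
    Kc = H @ K @ H
    vals, vecs = np.linalg.eigh(Kc)
    # eigh returns eigenvalues in ascending order; keep the largest ones.
    idx = np.argsort(vals)[::-1][:n_components]
    lam, alpha = vals[idx], vecs[:, idx]
    Y = alpha * np.sqrt(lam)  # low-dimensional training embedding
    return {"X": X, "K": K, "alpha": alpha, "lam": lam,
            "gamma": gamma, "Y": Y}

def out_of_sample(model, X_new):
    # Nystrom-style extension: centre the new kernel rows against the
    # training kernel, then project onto the training eigenvectors.
    X, K, gamma = model["X"], model["K"], model["gamma"]
    k = rbf_kernel(X_new, X, gamma)
    k_c = (k - k.mean(1, keepdims=True)
             - K.mean(0, keepdims=True) + K.mean())
    return k_c @ (model["alpha"] / np.sqrt(model["lam"]))
```

Applied to a training point, the extension reproduces that point's original embedding exactly, which is a useful sanity check on any out-of-sample method of this kind.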



Author information


Correspondence to Harry Strange.


Copyright information

© 2014 The Author(s)

About this chapter

Cite this chapter

Strange, H., Zwiggelaar, R. (2014). Incorporating New Points. In: Open Problems in Spectral Dimensionality Reduction. SpringerBriefs in Computer Science. Springer, Cham. https://doi.org/10.1007/978-3-319-03943-5_5


  • DOI: https://doi.org/10.1007/978-3-319-03943-5_5


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-03942-8

  • Online ISBN: 978-3-319-03943-5

  • eBook Packages: Computer Science, Computer Science (R0)
