Abstract
This chapter outlines and assesses methods for the out-of-sample extension problem (mapping new points from the high-dimensional space into an existing low-dimensional embedding) and the pre-image problem (mapping points from the low-dimensional space back into the high-dimensional space). Methods for incremental learning, in which a low-dimensional embedding is updated as new data points arrive rather than being computed in batch, are also discussed.
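To make the two problems concrete, the following is a minimal illustrative sketch, not any of the methods surveyed in the chapter: it embeds a new point (out-of-sample extension) and reconstructs a high-dimensional point (pre-image) by distance-weighted interpolation over nearest neighbours. The function names and the inverse-distance weighting scheme are assumptions made for illustration; established approaches include the Nyström extension and neighbourhood-reconstruction schemes.

```python
import numpy as np

def out_of_sample(X, Y, x_new, k=3):
    """Out-of-sample extension (illustrative): map a new high-dimensional
    point x_new into the existing low-dimensional embedding Y of the
    training data X via inverse-distance interpolation over its k
    nearest neighbours in the high-dimensional space."""
    d = np.linalg.norm(X - x_new, axis=1)   # distances to training points
    idx = np.argsort(d)[:k]                 # indices of k nearest neighbours
    w = 1.0 / (d[idx] + 1e-12)              # inverse-distance weights
    w /= w.sum()                            # normalise weights to sum to 1
    return w @ Y[idx]                       # interpolated low-dimensional point

def pre_image(X, Y, y_new, k=3):
    """Pre-image (illustrative): map a low-dimensional point y_new back
    into the high-dimensional space by the same interpolation, with the
    neighbour search performed in the embedding space."""
    d = np.linalg.norm(Y - y_new, axis=1)
    idx = np.argsort(d)[:k]
    w = 1.0 / (d[idx] + 1e-12)
    w /= w.sum()
    return w @ X[idx]
```

Note that the two mappings are deliberately symmetric: each interpolates in one space using weights computed in the other, which is why a training point maps (almost exactly) onto its own embedding and back.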
Cite this chapter
Strange, H., Zwiggelaar, R. (2014). Incorporating New Points. In: Open Problems in Spectral Dimensionality Reduction. SpringerBriefs in Computer Science. Springer, Cham. https://doi.org/10.1007/978-3-319-03943-5_5
Print ISBN: 978-3-319-03942-8
Online ISBN: 978-3-319-03943-5