Optimal Landmark Selection for Nyström Approximation

  • Conference paper
Neural Information Processing (ICONIP 2014)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 8835)


Abstract

The Nyström method is an efficient technique for large-scale kernel learning: it provides a low-rank approximation to the full kernel matrix. The quality of the Nyström approximation depends largely on the choice of landmark points. While the standard method samples columns of the kernel matrix uniformly at random, improved sampling techniques have been proposed based on ensemble learning [1] and clustering [2]. These methods focus on minimizing the approximation error for the original kernel matrix. In this paper, we take a different perspective and instead minimize the approximation error for the input vectors. We show that, under a restrictive condition, the new formulation is equivalent to the standard Nyström solution. This leads to a novel approach for optimizing landmark points for the Nyström approximation. Experimental results demonstrate that the proposed landmark optimization method achieves lower approximation errors than existing Nyström methods.
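For readers unfamiliar with the baseline the paper improves on, the following is a minimal sketch of the standard uniform-sampling Nyström approximation described above: pick m landmark points, form the n×m cross-kernel matrix C and the m×m landmark kernel W, and approximate K ≈ C W⁺ Cᵀ. The RBF kernel and the function names are illustrative choices, not taken from the paper.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian RBF kernel matrix between the rows of X and Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)  # squared distances
    return np.exp(-gamma * d2)

def nystrom_approx(X, m, gamma=1.0, seed=0):
    """Standard Nyström approximation K ~ C W^+ C^T with m uniformly
    sampled landmark points (the baseline the paper compares against)."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    idx = rng.choice(n, size=m, replace=False)  # uniform landmark selection
    C = rbf_kernel(X, X[idx], gamma)            # n x m cross-kernel block
    W = C[idx]                                  # m x m landmark kernel block
    return C @ np.linalg.pinv(W) @ C.T

# Compare against the exact kernel matrix on toy data.
X = np.random.default_rng(1).standard_normal((200, 5))
K = rbf_kernel(X, X)
K_hat = nystrom_approx(X, m=50)
err = np.linalg.norm(K - K_hat, "fro") / np.linalg.norm(K, "fro")
print(f"relative Frobenius error: {err:.4f}")
```

The relative Frobenius error shrinks as m grows; the landmark-selection methods discussed in the paper aim to reduce this error for a fixed m.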



References

  1. Kumar, S., Mohri, M., Talwalkar, A.: Sampling methods for the Nyström method. Journal of Machine Learning Research 13, 981–1006 (2012)

  2. Zhang, K., Kwok, J.T.: Clustered Nyström method for large scale manifold learning and dimension reduction. IEEE Transactions on Neural Networks 21(10), 1576–1587 (2010)

  3. Schölkopf, B., Smola, A.J.: Learning with Kernels: Support Vector Machines, Regularization, Optimization and Beyond. MIT Press (2002)

  4. Williams, C.K.I., Seeger, M.: Using the Nyström method to speed up kernel machines. In: NIPS (2001)

  5. Drineas, P., Mahoney, M.W.: On the Nyström method for approximating a Gram matrix for improved kernel-based learning. Journal of Machine Learning Research 6, 2153–2175 (2005)

  6. Bonnans, J.F., Shapiro, A.: Optimization problems with perturbations: A guided tour. SIAM Review 40(2), 202–227 (1998)

  7. Rakotomamonjy, A., Bach, F.R., Canu, S., Grandvalet, Y.: SimpleMKL. Journal of Machine Learning Research 9(11), 2491–2521 (2008)

  8. Fu, Z., Lu, G., Ting, K.M., Zhang, D.: Learning sparse kernel classifiers for multi-instance classification. IEEE Transactions on Neural Networks 24(9), 1377–1389 (2013)

  9. Bach, F.R., Jordan, M.I.: Predictive low-rank decomposition for kernel methods. In: ICML, pp. 33–40 (2005)


Copyright information

© 2014 Springer International Publishing Switzerland

About this paper

Cite this paper

Fu, Z. (2014). Optimal Landmark Selection for Nyström Approximation. In: Loo, C.K., Yap, K.S., Wong, K.W., Teoh, A., Huang, K. (eds) Neural Information Processing. ICONIP 2014. Lecture Notes in Computer Science, vol 8835. Springer, Cham. https://doi.org/10.1007/978-3-319-12640-1_38

  • DOI: https://doi.org/10.1007/978-3-319-12640-1_38

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-12639-5

  • Online ISBN: 978-3-319-12640-1

  • eBook Packages: Computer Science (R0)
