Efficiently Learning the Metric with Side-Information

  • Conference paper
In: Algorithmic Learning Theory (ALT 2003)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 2842)

Abstract

A crucial problem in machine learning is to choose a representation of the data that emphasizes the relations we are interested in. In many cases this amounts to finding a suitable metric on the data space. In the supervised case, Linear Discriminant Analysis (LDA) can be used to find a subspace in which the class structure of the data is apparent. Other ways to learn a suitable metric are found in [6] and [11]. Recently, however, significant attention has been devoted to the problem of learning a metric in the semi-supervised case. In particular, the work by Xing et al. [15] has demonstrated how semi-definite programming (SDP) can be used to directly learn a distance measure that satisfies constraints given in the form of side-information, and they obtain a significant increase in clustering performance with the new representation. The approach is very interesting; however, its computational complexity severely limits its applicability to real machine learning tasks. In this paper we present an alternative solution for incorporating side-information that specifies pairs of examples belonging to the same class. The approach is based on LDA and reduces to an eigenproblem that can be solved efficiently. The performance reached is very similar, but the complexity is only O(d^3) instead of O(d^6), where d is the dimensionality of the data. We also show how our method can be extended to deal with more general types of side-information.
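The abstract does not give the method's details; the following is a hedged sketch of the general idea it describes: an LDA-style projection learned from must-link side-information by solving a single generalized eigenproblem, whose cost is dominated by the O(d^3) eigendecomposition. The function name `learn_metric`, the scatter definitions, and the regularization term are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.linalg import eigh


def learn_metric(X, pairs, k=2, reg=1e-6):
    """Learn a k-dimensional projection from must-link side-information.

    X     : (n, d) data matrix
    pairs : list of (i, j) index pairs known to share a class
    k     : dimensionality of the learned subspace
    reg   : small ridge term keeping the scatter matrix invertible
    """
    n, d = X.shape

    # Within-pair scatter: sum of (x_i - x_j)(x_i - x_j)^T over the pairs.
    Sw = np.zeros((d, d))
    for i, j in pairs:
        diff = (X[i] - X[j])[:, None]
        Sw += diff @ diff.T

    # Total scatter of the centred data.
    Xc = X - X.mean(axis=0)
    St = Xc.T @ Xc

    # Generalized symmetric eigenproblem  St w = lambda (Sw + reg*I) w.
    # Its dominant eigenvectors favour directions of high overall variance
    # relative to the variance within the linked pairs; solving it costs
    # O(d^3), as opposed to the O(d^6) of the SDP formulation.
    evals, evecs = eigh(St, Sw + reg * np.eye(d))
    return evecs[:, -k:]  # top-k generalized eigenvectors, one per column
```

Projecting the data as `X @ W` with the returned `W` then yields a representation in which the linked pairs tend to lie close together.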


References

  1. Bach, F.R., Jordan, M.I.: Kernel independent component analysis. Journal of Machine Learning Research 3, 1–48 (2002)

  2. Barker, M., Rayens, W.S.: Partial least squares for discrimination. Journal of Chemometrics 17, 166–173 (2003)

  3. Bartlett, M.S.: Further aspects of the theory of multiple regression. Proc. Camb. Philos. Soc. 34, 33–40 (1938)

  4. Borga, M., Landelius, T., Knutsson, H.: A Unified Approach to PCA, PLS, MLR and CCA. Report LiTH-ISY-R-1992, ISY, SE-581 83 Linköping, Sweden (November 1997)

  5. Bradley, P., Bennett, K., Demiriz, A.: Constrained K-means clustering. Technical Report MSR-TR-2000-65, Microsoft Research (2000)

  6. Cristianini, N., Shawe-Taylor, J., Elisseeff, A., Kandola, J.: On kernel-target alignment. In: Dietterich, T.G., Becker, S., Ghahramani, Z. (eds.) Advances in Neural Information Processing Systems 14, MIT Press, Cambridge (2002)

  7. Duda, R.O., Hart, P.E., Stork, D.G.: Pattern Classification, 2nd edn. John Wiley & Sons, Inc., Chichester (2000)

  8. Fisher, R.A.: The use of multiple measurements in taxonomic problems. Annals of Eugenics 7(Part II), 179–188 (1936)

  9. Hofmann, T.: What people don't want. In: European Conference on Machine Learning, ECML (2002)

  10. Horn, R.A., Johnson, C.R.: Topics in Matrix Analysis. Cambridge University Press, Cambridge (1991)

  11. Lanckriet, G., Cristianini, N., Bartlett, P., El Ghaoui, L., Jordan, M.I.: Learning the kernel matrix with semi-definite programming. Technical Report CSD-02-1206, Division of Computer Science, University of California, Berkeley (2002)

  12. Rosipal, R., Trejo, L.J., Matthews, B.: Kernel PLS-SVC for linear and nonlinear classification. In: Proceedings of the Twentieth International Conference on Machine Learning (2003) (to appear)

  13. Vert, J.-P., Kanehisa, M.: Graph-driven features extraction from microarray data using diffusion kernels and CCA. In: Advances in Neural Information Processing Systems 15, MIT Press, Cambridge (2003)

  14. Vinokourov, A., Cristianini, N., Shawe-Taylor, J.: Inferring a semantic representation of text via cross-language correlation analysis. In: Advances in Neural Information Processing Systems 15, MIT Press, Cambridge (2003)

  15. Xing, E.P., Ng, A.Y., Jordan, M.I., Russell, S.: Distance metric learning, with application to clustering with side-information. In: Advances in Neural Information Processing Systems 15, MIT Press, Cambridge (2003)


Copyright information

© 2003 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

De Bie, T., Momma, M., Cristianini, N. (2003). Efficiently Learning the Metric with Side-Information. In: Gavaldá, R., Jantke, K.P., Takimoto, E. (eds) Algorithmic Learning Theory. ALT 2003. Lecture Notes in Computer Science, vol 2842. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-39624-6_15

  • DOI: https://doi.org/10.1007/978-3-540-39624-6_15

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-20291-2

  • Online ISBN: 978-3-540-39624-6
