Abstract
Multi-task learning has been widely studied in machine learning due to its capability to improve the performance of multiple related learning problems. However, few researchers have applied it to the important problem of metric learning. In this paper, we propose to couple multiple related metric learning tasks with the von Neumann divergence. On one hand, the novel regularized approach extends previous methods from vector regularization to a general matrix regularization framework; on the other hand, and more importantly, by exploiting the von Neumann divergence as the regularizer, the new multi-task metric learning method preserves the data geometry well. This leads to more appropriate propagation of side-information among tasks and offers the potential for further performance gains. We propose the concept of geometry preserving probability (PG) and show that our framework leads to a larger PG in theory. In addition, our formulation is jointly convex, so a globally optimal solution is guaranteed. A series of experiments across very different disciplines verifies that our proposed algorithm consistently outperforms current methods.
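The coupling regularizer at the heart of the abstract is the von Neumann divergence between positive-definite metric matrices, i.e. the Bregman divergence of the quantum entropy, D(A‖B) = tr(A log A − A log B − A + B). The following is a minimal sketch of how this quantity can be computed for symmetric positive-definite matrices via eigendecomposition; the function name and the example matrices are illustrative choices, not part of the paper.

```python
import numpy as np

def von_neumann_divergence(A, B):
    """Von Neumann divergence D(A||B) = tr(A log A - A log B - A + B)
    for symmetric positive-definite matrices A and B."""
    def logm_sym(M):
        # Matrix logarithm of a symmetric PD matrix via eigendecomposition:
        # M = V diag(w) V^T  =>  log M = V diag(log w) V^T
        w, V = np.linalg.eigh(M)
        return (V * np.log(w)) @ V.T
    return np.trace(A @ logm_sym(A) - A @ logm_sym(B) - A + B)

# Two hypothetical task metrics (symmetric positive definite).
A = np.array([[2.0, 0.5], [0.5, 1.0]])
B = np.array([[1.0, 0.2], [0.2, 1.5]])

print(von_neumann_divergence(A, B))  # non-negative, as for any Bregman divergence
print(von_neumann_divergence(A, A))  # zero when the two metrics coincide
```

As a Bregman divergence, this quantity is non-negative and vanishes exactly when A = B, which is what makes it a natural penalty for tying related metric matrices together across tasks.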
Copyright information
© 2012 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Yang, P., Huang, K., Liu, CL. (2012). Geometry Preserving Multi-task Metric Learning. In: Flach, P.A., De Bie, T., Cristianini, N. (eds) Machine Learning and Knowledge Discovery in Databases. ECML PKDD 2012. Lecture Notes in Computer Science(), vol 7523. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-33460-3_47
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-33459-7
Online ISBN: 978-3-642-33460-3