
Making metric learning algorithms invariant to transformations using a projection metric on Grassmann manifolds

  • Original Article
  • Published in: International Journal of Machine Learning and Cybernetics

Abstract

The need for suitable ways to measure the distance or similarity between data is omnipresent in machine learning, pattern recognition and data mining, but designing good metrics for particular problems is generally challenging. This has led to the emergence of metric learning, which aims to automatically learn a distance function tuned to a specific task. For many tasks and data types, there are natural transformations to which the classification result should be invariant or insensitive. This requirement is essential in many machine learning applications, and insensitivity to image transformations was originally achieved by using invariant feature vectors. In this paper, we propose a new representation model for data points on Grassmann manifolds and a novel method for learning a Mahalanobis metric that uses the geodesic distance on these manifolds. Specifically, we use an appropriate geodesic distance on the Grassmann manifold, called the projection metric, to measure the primary similarities between the new representations of the data points. This makes the learned Mahalanobis metric invariant to similarity transforms and intensity changes, and therefore improves performance. Experiments on face and handwritten digit datasets demonstrate that the proposed method improves the performance of a state-of-the-art metric learning algorithm.
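As a rough illustration of the geometry behind the abstract (a minimal sketch, not the authors' implementation), the following Python snippet computes the projection metric d_P(Y1, Y2) = (1/sqrt(2)) * ||Y1 Y1^T - Y2 Y2^T||_F between two linear subspaces represented by orthonormal basis matrices, i.e. points on a Grassmann manifold, and checks numerically that it is unchanged when a data matrix is rescaled in intensity or its basis is rotated. All function and variable names (to_grassmann_point, projection_metric) are illustrative, and the learned Mahalanobis component of the paper's method is omitted.

# Minimal sketch, assuming subspace representations of data points:
# the projection metric on the Grassmann manifold G(p, D) between
# span(Y1) and span(Y2), where Y1, Y2 are D x p matrices with
# orthonormal columns. Names are illustrative, not from the paper's code.
import numpy as np

def to_grassmann_point(X):
    """Map a D x p data matrix to an orthonormal basis of its column span
    (a point on the Grassmann manifold) via thin QR decomposition."""
    Q, _ = np.linalg.qr(X)
    return Q

def projection_metric(Y1, Y2):
    """Projection distance between span(Y1) and span(Y2)."""
    P1 = Y1 @ Y1.T          # orthogonal projector onto span(Y1)
    P2 = Y2 @ Y2.T          # orthogonal projector onto span(Y2)
    return np.linalg.norm(P1 - P2, 'fro') / np.sqrt(2)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    D, p = 100, 5                      # ambient dimension, subspace dimension
    X = rng.standard_normal((D, p))    # toy feature matrix for one data point

    Y = to_grassmann_point(X)
    # Intensity scaling and a rotation of the basis leave the spanned
    # subspace unchanged, so the projection metric is numerically zero:
    R = np.linalg.qr(rng.standard_normal((p, p)))[0]   # random p x p rotation
    Y_transformed = to_grassmann_point(3.0 * X @ R)
    print(projection_metric(Y, Y_transformed))         # ~0 (same subspace)

    # A different random subspace gives a strictly positive distance:
    Z = to_grassmann_point(rng.standard_normal((D, p)))
    print(projection_metric(Y, Z))                     # > 0

Because the distance depends only on the projectors Y Y^T, any transformation of the raw data that preserves the spanned subspace (such as the intensity change and basis rotation above) leaves the primary similarities, and hence the subsequent metric learning, unaffected.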


Notes

  1. http://www.cad.zju.edu.cn/home/dengcai/Data/FaceData.html.

  2. http://www.cad.zju.edu.cn/home/dengcai/Data/FaceData.html.

  3. http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/datasets/multiclass.html.

  4. http://yann.lecun.com/exdb/mnist/.


Author information

Corresponding author: Peyman Adibi

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Goudarzi, Z., Adibi, P., Grigat, RR. et al. Making metric learning algorithms invariant to transformations using a projection metric on Grassmann manifolds. Int. J. Mach. Learn. & Cyber. 10, 3407–3416 (2019). https://doi.org/10.1007/s13042-019-00927-4

