Aggregation of multiple metric descriptions from distances between unlabeled objects

Article

Abstract

The recognition problem in which several different semimetrics are defined on the same set of objects is considered. The problem of aggregating these distances on the basis of an unlabeled sample is stated and investigated; in other words, the unsupervised reduction of the dimension of multiple metric descriptions is studied. This problem is reduced to approximating the original distances by an optimal matrix factorization subject to additional metric constraints, and it is proposed to solve it exactly using metric nonnegative matrix factorization. In terms of both the problem statement and the solution procedure, this metric method is an analog of principal component analysis for feature-based descriptions. It is proved that adding the metric constraints does not decrease the quality of the approximation. The operation of the method is demonstrated on toy and real-life examples.
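The core idea can be illustrated with a small sketch. The paper's exact metric constraints are not reproduced here; this is a plain nonnegative matrix factorization (Lee–Seung multiplicative updates) applied to stacked pairwise distances, where each of several hypothetical semimetrics contributes one column of a pairs-by-metrics matrix D, and the low-rank nonnegative factors give a reduced set of aggregated distance descriptions:

```python
import numpy as np

def nmf(D, k, n_iter=500, eps=1e-9, seed=0):
    """Plain Lee-Seung multiplicative-update NMF: D ~= W @ H, all entries >= 0."""
    rng = np.random.default_rng(seed)
    m, n = D.shape
    W = rng.random((m, k)) + eps
    H = rng.random((k, n)) + eps
    for _ in range(n_iter):
        H *= (W.T @ D) / (W.T @ W @ H + eps)
        W *= (D @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Three toy semimetrics on 5 objects; the upper triangle of each
# 5x5 distance matrix becomes one column of D (object pairs x metrics).
rng = np.random.default_rng(1)
X = rng.random((5, 2))
dists = [
    np.abs(X[:, None, 0] - X[None, :, 0]),                  # L1 on coordinate 0
    np.abs(X[:, None, 1] - X[None, :, 1]),                  # L1 on coordinate 1
    np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2),  # Euclidean
]
iu = np.triu_indices(5, k=1)
D = np.column_stack([d[iu] for d in dists])  # shape (10, 3)

# Aggregate the three metric descriptions into a single one (rank 1).
W, H = nmf(D, k=1)
err = np.linalg.norm(D - W @ H) / np.linalg.norm(D)
```

The columns of W can be read as aggregated distance descriptions, and the rows of H as nonnegative mixing weights of the original semimetrics; the additional step of the paper, enforcing that the reconstructed columns remain semimetrics, is omitted in this sketch.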

Keywords

multiple metric descriptions; multiple metric spaces; similarity measures; dimension reduction; nonnegative matrix factorization (NMF); principal component analysis (PCA)



Copyright information

© Pleiades Publishing, Ltd. 2017

Authors and Affiliations

Faculty of Computational Mathematics and Cybernetics, Moscow State University, Moscow, Russia
