Pattern Recognition and Image Analysis, Volume 28, Issue 4, pp. 712–719

Multidimensional Data Visualization Based on the Minimum Distance Between Convex Hulls of Classes

  • A. P. Nemirko
Mathematical Method in Pattern Recognition


The problem of data visualization in the analysis of two classes in a multidimensional feature space is considered. A linear transformation of coordinates yields two orthogonal axes along which the classes are maximally separated in the resulting mapping. The proximity of the classes is estimated by the minimum distance between their convex hulls. This criterion makes it possible to reveal cases of complete class separability and to expose random outliers. A support vector machine is used to obtain the orthogonal vectors of the reduced space; for linearly separable classes, this method yields the weight vector that determines the minimum distance between the convex hulls of the classes. For intersecting classes, algorithms based on reduction, contraction, and offset of the convex hulls are used. Experimental studies apply the considered visualization methods to biomedical data analysis.
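The projection idea described above can be illustrated with a minimal sketch. This is not the paper's exact algorithm: it assumes scikit-learn's LinearSVC with a large C as a stand-in for the hard-margin SVM (whose weight vector is collinear with the segment realizing the minimum distance between the convex hulls of linearly separable classes), and a simple deflation step, removing the first axis's component from the data before refitting, to obtain the second orthogonal axis. The function name svm_projection_axes and all parameter values are illustrative.

```python
import numpy as np
from sklearn.svm import LinearSVC

def svm_projection_axes(X, y):
    """Return two orthonormal axes separating two classes, via repeated SVM fits."""
    svm = LinearSVC(C=1e6, max_iter=100_000)  # large C approximates a hard margin
    svm.fit(X, y)
    w1 = svm.coef_.ravel()
    w1 = w1 / np.linalg.norm(w1)              # first axis: SVM weight direction

    # Deflation: project out the w1 component, then fit a second SVM
    # on the residual data to find the next most separating direction.
    X2 = X - np.outer(X @ w1, w1)
    svm.fit(X2, y)
    w2 = svm.coef_.ravel()
    w2 = w2 - (w2 @ w1) * w1                  # enforce orthogonality numerically
    w2 = w2 / np.linalg.norm(w2)
    return w1, w2

# Usage on synthetic 4-D data with two well-separated classes.
rng = np.random.default_rng(0)
A = rng.normal(0.0, 1.0, (50, 4))
B = rng.normal(4.0, 1.0, (50, 4))
X = np.vstack([A, B])
y = np.array([0] * 50 + [1] * 50)

w1, w2 = svm_projection_axes(X, y)
XY = np.c_[X @ w1, X @ w2]                    # 2-D coordinates for a scatter plot
```

Projecting the data onto (w1, w2) gives the two-dimensional mapping in which the inter-class gap along the first axis reflects the minimum distance between the class convex hulls.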


Keywords: multidimensional data visualization, machine learning, support vector machine, biomedical data analysis





Copyright information

© Pleiades Publishing, Ltd. 2018

Authors and Affiliations

  1. St. Petersburg Electrotechnical University “LETI”, St. Petersburg, Russia
