
Homogeneous Space as Media for the Inductive Selection of Separating Features for the Construction of Classification Rules

  • Tatjana Lange
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 1080)

Abstract

This paper deals with problems that require the reconstruction of the structure of multi-dimensional dependencies from data. From a mathematical point of view, these problems belong to the most complicated problems of artificial intelligence, such as the reconstruction of the structure of multi-dimensional regressions or difference equations. In classification we meet similar problems when we have to select the space of classification features. Here we consider a special problem of supervisor-based classification that can be solved by the classification method “Alpha-procedure”. The problem is the following: normally, the separating rule is constructed during a training phase in which a supervisor assigns the objects of a training set, which may be small, to classes. Obviously, however, the rich (partly subconscious) experience of the supervisor, which is not described quantitatively, somehow influences his decision. This may concern the importance, the uselessness, or even the harmfulness of individual features. For this reason, constructing the separating rule directly in the Euclidean data space leads in certain cases to an unstable rule. The paper explains why the Alpha-procedure, which constructs the separating rule inductively in the homogeneous Lorentz space, allows a stable classification of new objects in the application phase, where no supervisor is available. It also shows, from the point of view of transformation groups and their invariants, the difference between the mathematical apparatus for the search for the decision rule in a fixed feature space and in a space that is constructed by selecting features.
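To make the idea of inductive feature selection more concrete: the homogeneous Lorentz group of an n-dimensional space is the group of linear transformations that preserve the indefinite quadratic form x_1^2 + … + x_{n-1}^2 − x_n^2, and the Alpha-procedure builds its separating rule step by step in such a space rather than fixing the Euclidean feature space in advance. The sketch below is not the Alpha-procedure itself but a minimal, simplified illustration of the inductive principle it relies on: a synthetic separating feature is grown greedily, one raw feature at a time, so that useless or harmful features are simply never selected. All function names, the angle-scan scoring, and the midpoint-threshold rule are illustrative assumptions and are not taken from the paper.

# A minimal sketch of greedy inductive selection of separating features
# (illustrative only -- NOT the Alpha-procedure described in the paper).
import numpy as np

def threshold_score(z, y):
    """Accuracy of the midpoint-threshold rule on the 1-D projection z
    for binary labels y in {0, 1}."""
    thr = 0.5 * (z[y == 0].mean() + z[y == 1].mean())
    return max(np.mean((z > thr) == y), np.mean((z <= thr) == y))

def best_direction_2d(u, v, y, n_angles=180):
    """Scan directions in the plane spanned by (u, v) and return the angle
    whose 1-D projection separates the two classes best, with its score."""
    best_theta, best_score = 0.0, -1.0
    for theta in np.linspace(0.0, np.pi, n_angles, endpoint=False):
        score = threshold_score(u * np.cos(theta) + v * np.sin(theta), y)
        if score > best_score:
            best_theta, best_score = theta, score
    return best_theta, best_score

def inductive_selection(X, y, steps=4):
    """Grow one synthetic separating feature step by step: start from the
    individually best raw feature and, at each step, combine the current
    synthetic feature with whichever raw feature improves separation most.
    Useless or harmful features are never picked and so drop out."""
    single = [threshold_score(X[:, j], y) for j in range(X.shape[1])]
    current = X[:, int(np.argmax(single))].copy()
    history = [(int(np.argmax(single)), float(max(single)))]
    for _ in range(steps):
        best = None
        for j in range(X.shape[1]):
            theta, score = best_direction_2d(current, X[:, j], y)
            if best is None or score > best[2]:
                best = (j, theta, score)
        j, theta, score = best
        if score <= history[-1][1]:          # stop when no feature helps
            break
        current = current * np.cos(theta) + X[:, j] * np.sin(theta)
        history.append((j, float(score)))
    return current, history

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 100
    y = np.repeat([0, 1], n)
    informative = np.vstack([rng.normal(0.0, 1.0, (n, 2)),
                             rng.normal(2.0, 1.0, (n, 2))])   # 2 useful features
    noise = rng.normal(0.0, 1.0, (2 * n, 3))                  # 3 useless features
    X = np.hstack([informative, noise])
    z, hist = inductive_selection(X, y)
    print("selected (feature index, score) per step:", hist)

In this toy run the two informative features are picked first and the pure-noise features are left out as soon as they stop improving separation, which mimics, in a very reduced form, the stability argument the abstract makes for constructing the rule by inductive feature selection.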

Keywords

Classification · Pattern recognition · Homogeneous Lorentz space · Transformation groups · Invariant · Alpha-procedure · Inductive selection


Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  1. University of Applied Sciences, Merseburg, Germany
