Abstract
This paper deals with problems that require the reconstruction of structures of multi-dimensional dependencies from data. From a mathematical point of view, these belong to the most complicated problems of artificial intelligence, such as the reconstruction of the structures of multi-dimensional regressions or difference equations. In classification, we meet similar problems when we have to select the space of classification features. Here we consider a special problem of supervisor-based classification that can be solved by the classification method “Alpha-procedure”. The problem is the following: normally, the separating rule is constructed during a training phase in which a supervisor assigns objects to classes using a training set that may be small. But the rich (partly subconscious) experience of the supervisor, which is not described quantitatively, obviously influences these assignments; this may concern the importance, the uselessness, or even the harmfulness of individual features. For this reason, constructing the separation rule directly in the Euclidean data space leads, in certain cases, to an unstable rule. The paper explains why the Alpha-procedure, which constructs the separating rule inductively in the homogeneous Lorentz space, allows a stable classification of new objects in the application phase, without a supervisor. It also shows, from the point of view of group transformations and their invariants, the difference between the mathematical apparatus for the search of the decision rule in a fixed feature space and in a space that is constructed by selecting features.
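The inductive construction described in the abstract — building a separating rule step by step by selecting the features that separate the classes best — can be sketched in miniature. The sketch below is an illustration only, not the Alpha-procedure itself: the greedy loop, the simple one-dimensional threshold rule, and all names are assumptions made for demonstration.

```python
# Illustrative sketch of inductive feature selection for a two-class
# separating rule. NOT the Alpha-procedure: the threshold rule and the
# greedy selection loop are simplifying assumptions for demonstration.

def rule_error(xs, ys):
    """Misclassifications of the best threshold rule on one 1-D feature."""
    best = len(ys)
    values = sorted(set(xs))
    thresholds = [(a + b) / 2 for a, b in zip(values, values[1:])]
    for t in thresholds + [values[0] - 1, values[-1] + 1]:
        for sign in (1, -1):  # try both orientations of the rule
            err = sum(1 for x, y in zip(xs, ys)
                      if (1 if sign * (x - t) > 0 else 0) != y)
            best = min(best, err)
    return best

def greedy_select(data, labels, max_features=2):
    """Inductively pick the features whose projections separate best."""
    chosen = []
    remaining = list(range(len(data[0])))
    for _ in range(max_features):
        # score each remaining feature by its best 1-D separating rule
        scored = [(rule_error([row[j] for row in data], labels), j)
                  for j in remaining]
        err, j = min(scored)
        chosen.append(j)
        remaining.remove(j)
        if err == 0:  # training set already separated; stop inducting
            break
    return chosen

# Tiny synthetic example: feature 0 alone separates the two classes.
features = [[0.1, 5.0], [0.2, 1.0], [0.9, 4.0], [1.1, 2.0]]
classes = [0, 0, 1, 1]
print(greedy_select(features, classes))  # → [0]
```

The point of the sketch is only the control flow: each step selects a feature by how well it supports a separating rule, and the rule is grown inductively rather than fitted once in the full fixed feature space.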
Notes
- 1.
In the sense of the minimum of the intersection of objects belonging to different classes.
- 2.
See footnote 1.
- 3.
The totality of all lines and (two-dimensional) planes of a space which pass through a given point \( S \) of the space is called the projective bundle of lines and planes with the centre \( S \).
- 4.
The coefficient \( c = c_{1} = c_{2} = \ldots = c_{n} \) may be the speed of light (taken with a minus sign), as in the special theory of relativity.
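Footnote 4 can be made concrete with the standard interval of special relativity, in which the spatial coordinates enter with the common negative coefficient the footnote alludes to. This form is quoted here as context only; it is textbook relativity, not reproduced from the paper's own derivation:

\[
s^{2} \;=\; c^{2} t^{2} \;-\; x_{1}^{2} \;-\; x_{2}^{2} \;-\; x_{3}^{2},
\]

a quadratic form that is invariant under Lorentz transformations — the group underlying the homogeneous Lorentz space in which the abstract places the Alpha-procedure.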
Copyright information
© 2020 Springer Nature Switzerland AG
Cite this paper
Lange, T. (2020). Homogeneous Space as Media for the Inductive Selection of Separating Features for the Construction of Classification Rules. In: Shakhovska, N., Medykovskyy, M.O. (eds) Advances in Intelligent Systems and Computing IV. CSIT 2019. Advances in Intelligent Systems and Computing, vol 1080. Springer, Cham. https://doi.org/10.1007/978-3-030-33695-0_34
Print ISBN: 978-3-030-33694-3
Online ISBN: 978-3-030-33695-0
eBook Packages: Intelligent Technologies and Robotics