Subspace Learning with an Archive-Based Genetic Algorithm
Feature selection is a useful technique for mitigating the curse of dimensionality. Conventional feature selection chooses a single feature subset for all samples. In complex datasets, however, samples from different classes often exhibit their intrinsic properties in different local feature subsets. Subspace learning is an alternative feature selection approach that generates multiple subspaces, one tailored to each class. In this paper, we propose a subspace ensemble method based on an archive-based genetic algorithm. Experimental results show that the proposed method outperforms conventional ensemble learning algorithms.
Keywords: Feature selection · Subspace learning · Genetic algorithm · Classification
This work was supported by the General Program of the National Natural Science Foundation of China (Grant Nos. 71471127 and 71371135).
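To make the approach concrete, the following is a minimal sketch of an archive-based genetic algorithm for subspace selection. All names and parameters here (`fitness`, `evolve_archive`, `pop_size`, `archive_size`, `p_mut`, the CART base learner, and the majority-vote combiner) are illustrative assumptions, not the paper's actual design; per-class subspaces could, for instance, be obtained by running the search separately on one-vs-rest labels.

```python
# Hedged sketch of an archive-based GA for subspace ensemble learning.
# All hyperparameters and function names are illustrative assumptions.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def fitness(mask, X, y):
    # Fitness of a feature subset: cross-validated accuracy of a CART
    # base learner restricted to the selected feature columns.
    if not mask.any():
        return 0.0
    clf = DecisionTreeClassifier(random_state=0)
    return cross_val_score(clf, X[:, mask], y, cv=3).mean()

def evolve_archive(X, y, pop_size=20, archive_size=5,
                   generations=30, p_mut=0.1):
    # Evolve binary feature masks and keep an archive of the best
    # subsets seen so far; the archive defines the ensemble's subspaces.
    n_features = X.shape[1]
    pop = rng.random((pop_size, n_features)) < 0.5  # random binary masks
    archive = []  # (fitness, mask) pairs
    for _ in range(generations):
        scores = np.array([fitness(m, X, y) for m in pop])
        for s, m in zip(scores, pop):
            archive.append((s, m.copy()))
        archive = sorted(archive, key=lambda t: -t[0])[:archive_size]
        # Binary tournament selection, uniform crossover, bit-flip mutation.
        new_pop = []
        while len(new_pop) < pop_size:
            i, j = rng.integers(pop_size, size=2)
            a = pop[i] if scores[i] >= scores[j] else pop[j]
            i, j = rng.integers(pop_size, size=2)
            b = pop[i] if scores[i] >= scores[j] else pop[j]
            cross = rng.random(n_features) < 0.5
            child = np.where(cross, a, b)
            child ^= rng.random(n_features) < p_mut  # flip bits
            new_pop.append(child)
        pop = np.array(new_pop)
    return [m for _, m in archive]

def subspace_ensemble_predict(X_train, y_train, X_test, masks):
    # Train one CART per archived subspace, combine by majority vote.
    # Assumes integer-coded class labels (required by np.bincount).
    votes = []
    for m in masks:
        clf = DecisionTreeClassifier(random_state=0)
        clf.fit(X_train[:, m], y_train)
        votes.append(clf.predict(X_test[:, m]))
    votes = np.array(votes)
    return np.apply_along_axis(lambda c: np.bincount(c).argmax(), 0, votes)
```

Under these assumptions, `evolve_archive(X_train, y_train)` returns the archived feature masks and `subspace_ensemble_predict(X_train, y_train, X_test, masks)` yields the majority-vote prediction of the resulting subspace ensemble.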