
Subspace Learning with an Archive-Based Genetic Algorithm

  • Kai Liu
  • Jin Tian
Conference paper

Abstract

Feature selection is a useful technique for mitigating the curse of dimensionality. Conventional feature selection chooses a single feature subset for all samples. In complex datasets, however, different groups of samples often have intrinsic properties that are best captured by different local feature subsets. Subspace learning is an alternative feature selection approach that generates multiple subspaces for different classes. In this paper we propose a subspace ensemble method based on an archive-based genetic algorithm. Experimental results show that the proposed method outperforms conventional ensemble learning algorithms.
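
As a rough illustration of the approach described above, the following minimal sketch evolves binary feature masks with a genetic algorithm, keeps the best mask of each generation in an archive, and combines classifiers trained on the archived subspaces by majority vote. The population size, genetic operators, archive size, fitness measure (cross-validated accuracy), and base learner (a scikit-learn decision tree) are illustrative assumptions, not the authors' exact design.

    # Sketch of an archive-based genetic algorithm for subspace ensembles.
    # All hyperparameters below are illustrative assumptions.
    import numpy as np
    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(0)
    X, y = load_breast_cancer(return_X_y=True)
    n_features = X.shape[1]

    def fitness(mask):
        # Fitness of one subspace: cross-validated accuracy of a base
        # learner trained on the selected features only.
        if not mask.any():
            return 0.0
        return cross_val_score(DecisionTreeClassifier(random_state=0),
                               X[:, mask], y, cv=3).mean()

    pop = rng.random((20, n_features)) < 0.5  # population of binary feature masks
    archive = []  # (score, mask) pairs preserved across generations

    for generation in range(15):
        scores = np.array([fitness(m) for m in pop])
        archive.append((scores.max(), pop[scores.argmax()].copy()))
        archive = sorted(archive, key=lambda t: -t[0])[:10]  # keep 10 best subspaces
        new_pop = []
        for _ in range(len(pop)):
            # Binary tournament selection for each parent.
            a, b = rng.choice(len(pop), size=2, replace=False)
            p1 = pop[a] if scores[a] >= scores[b] else pop[b]
            a, b = rng.choice(len(pop), size=2, replace=False)
            p2 = pop[a] if scores[a] >= scores[b] else pop[b]
            # Uniform crossover followed by bit-flip mutation.
            child = np.where(rng.random(n_features) < 0.5, p1, p2)
            child ^= rng.random(n_features) < 0.02
            new_pop.append(child)
        pop = np.array(new_pop)

    # Final ensemble: majority vote over classifiers trained on the
    # archived subspaces.
    models = [(m, DecisionTreeClassifier(random_state=0).fit(X[:, m], y))
              for _, m in archive]
    votes = np.stack([clf.predict(X[:, m]) for m, clf in models])
    pred = np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)
    print("ensemble training accuracy:", (pred == y).mean())

Keeping an archive across generations preserves diverse, high-performing subspaces rather than relying on the final population alone, and that diversity is what the majority vote exploits.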

Keywords

Feature selection · Subspace learning · Genetic algorithm · Classification

Notes

Acknowledgements

This work was supported by the General Program of the National Natural Science Foundation of China (Grant Nos. 71471127 and 71371135).


Copyright information

© Springer Nature Singapore Pte Ltd. 2019

Authors and Affiliations

  1. College of Management and Economics, Tianjin University, Tianjin, China
