Ranking Based Unsupervised Feature Selection Methods: An Empirical Comparative Study in High Dimensional Datasets

  • Saúl Solorio-Fernández
  • J. Ariel Carrasco-Ochoa
  • José Fco. Martínez-Trinidad
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11288)


Unsupervised feature selection methods have attracted considerable interest in the scientific community because of their ability to identify and select relevant features in unlabeled data. In this paper, we evaluate and compare seven of the most widely used and outstanding state-of-the-art ranking-based unsupervised feature selection methods, all of which belong to the filter approach. Our study was conducted on 25 high-dimensional real-world datasets taken from the ASU Feature Selection Repository. From our experiments, we conclude which methods perform significantly better in terms of both quality of selection and runtime.
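To make the class of methods compared here concrete, the following is a minimal sketch of one of the ranking-based filter methods in this study, the Laplacian Score (He et al., reference 5). It is an illustrative implementation using only NumPy, not the authors' experimental code; the neighborhood size `k` and heat-kernel width `t` are free parameters chosen for illustration.

```python
import numpy as np

def laplacian_score(X, k=5, t=1.0):
    """Rank features by the Laplacian Score (He et al., 2005).

    X: (n_samples, n_features) data matrix.
    Returns one score per feature; a SMALLER score means the feature
    better preserves the local neighborhood structure of the data,
    so features are ranked by ascending score.
    """
    n = X.shape[0]
    # Pairwise squared Euclidean distances between samples.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    # k-nearest-neighbor graph with heat-kernel edge weights.
    S = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(d2[i])[1:k + 1]     # skip the point itself
        S[i, nbrs] = np.exp(-d2[i, nbrs] / t)
    S = np.maximum(S, S.T)                    # symmetrize the graph
    D = np.diag(S.sum(1))                     # degree matrix
    L = D - S                                 # graph Laplacian
    ones = np.ones(n)
    scores = np.empty(X.shape[1])
    for r in range(X.shape[1]):
        f = X[:, r]
        # Center f against the degree-weighted constant vector.
        f_t = f - (f @ D @ ones) / (ones @ D @ ones) * ones
        denom = f_t @ D @ f_t
        scores[r] = (f_t @ L @ f_t) / denom if denom > 1e-12 else np.inf
    return scores
```

Selecting the top-m features then reduces to `np.argsort(scores)[:m]`, which is the common pattern shared by all ranking-based filter methods compared in the paper: score every feature without labels, sort, and keep the best-ranked subset.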


Unsupervised feature selection · Filter methods · Feature ranking



The first author gratefully acknowledges the National Council of Science and Technology of Mexico (CONACyT) for his Ph.D. fellowship, through scholarship 428478.


  1. Dy, J.G., Brodley, C.E.: Feature selection for unsupervised learning. J. Mach. Learn. Res. 5, 845–889 (2004)
  2. Alelyani, S., Tang, J., Liu, H.: Feature selection for clustering: a review. Data Clust.: Algorithms Appl. 29, 110–121 (2013)
  3. Dash, M., Liu, H., Yao, J.: Dimensionality reduction of unsupervised data. In: Proceedings of the Ninth IEEE International Conference on Tools with Artificial Intelligence, pp. 532–539. IEEE Computer Society (1997)
  4. Mitra, P., Murthy, C.A., Pal, S.K.: Unsupervised feature selection using feature similarity. IEEE Trans. Pattern Anal. Mach. Intell. 24(3), 301–312 (2002)
  5. He, X., Cai, D., Niyogi, P.: Laplacian score for feature selection. In: Advances in Neural Information Processing Systems 18, pp. 507–514 (2005)
  6. Varshavsky, R., Gottlieb, A., Linial, M., Horn, D.: Novel unsupervised feature filtering of biological data. Bioinformatics 22(14), e507–e513 (2006)
  7. Zhao, Z., Liu, H.: Spectral feature selection for supervised and unsupervised learning. In: Proceedings of the 24th International Conference on Machine Learning, pp. 1151–1157. ACM (2007)
  8. Yang, Y., Shen, H.T., Ma, Z., Huang, Z., Zhou, X.: L2,1-norm regularized discriminative feature selection for unsupervised learning. In: IJCAI International Joint Conference on Artificial Intelligence, pp. 1589–1594 (2011)
  9. Tabakhi, S., Moradi, P., Akhlaghian, F.: An unsupervised feature selection algorithm based on ant colony optimization. Eng. Appl. Artif. Intell. 32, 112–123 (2014)
  10. Solorio-Fernández, S., Martínez-Trinidad, J.F., Carrasco-Ochoa, J.A.: A new unsupervised spectral feature selection method for mixed data: a filter approach. Pattern Recogn. 72, 314–326 (2017)
  11. Kim, Y., Street, W.N., Menczer, F.: Evolutionary model selection in unsupervised learning. Intell. Data Anal. 6(6), 531–556 (2002)
  12. Law, M.H.C., Figueiredo, M.A.T., Jain, A.K.: Simultaneous feature selection and clustering using mixture models. IEEE Trans. Pattern Anal. Mach. Intell. 26(9), 1154–1166 (2004)
  13. Breaban, M., Luchian, H.: A unifying criterion for unsupervised clustering and feature selection. Pattern Recogn. 44(4), 854–865 (2011)
  14. Dutta, D., Dutta, P., Sil, J.: Simultaneous feature selection and clustering with mixed features by multi objective genetic algorithm. Int. J. Hybrid Intell. Syst. 11(1), 41–54 (2014)
  15. Dash, M., Liu, H.: Feature selection for clustering. In: Terano, T., Liu, H., Chen, A.L.P. (eds.) PAKDD 2000. LNCS (LNAI), vol. 1805, pp. 110–121. Springer, Heidelberg (2000)
  16. Hruschka, E.R., Hruschka, E.R., Covoes, T.F., Ebecken, N.F.F.: Feature selection for clustering problems: a hybrid algorithm that iterates between k-means and a Bayesian filter. In: 2005 Fifth International Conference on Hybrid Intelligent Systems, HIS 2005. IEEE (2005)
  17. Li, Y., Lu, B.L., Wu, Z.F.: A hybrid method of unsupervised feature selection based on ranking. In: 18th International Conference on Pattern Recognition, ICPR 2006, Hong Kong, China, pp. 687–690 (2006)
  18. Solorio-Fernández, S., Carrasco-Ochoa, J., Martínez-Trinidad, J.: A new hybrid filter-wrapper feature selection method for clustering based on ranking. Neurocomputing 214, 866–880 (2016)
  19. Theodoridis, S., Koutroumbas, K.: Pattern Recognition. Elsevier Science, Amsterdam (2008)
  20. Liu, L., Kang, J., Yu, J., Wang, Z.: A comparative study on unsupervised feature selection methods for text clustering. In: Proceedings of the 2005 IEEE International Conference on Natural Language Processing and Knowledge Engineering, IEEE NLP-KE 2005, pp. 597–601. IEEE (2005)
  21. He, X., Niyogi, P.: Locality preserving projections. In: Advances in Neural Information Processing Systems, pp. 153–160 (2004)
  22. Fukunaga, K.: Introduction to Statistical Pattern Recognition, 2nd edn. Academic Press Professional Inc., San Diego (1990)
  23. Alter, O., Brown, P.O., Botstein, D.: Singular value decomposition for genome-wide expression data processing and modeling. Proc. Natl. Acad. Sci. U.S.A. 97(18), 10101–10106 (2000)
  24. Li, J., et al.: Feature selection: a data perspective. ACM Comput. Surv. (CSUR) 50(6), 94 (2017)
  25. Zhao, Z., Morstatter, F., Sharma, S., Alelyani, S., Anand, A., Liu, H.: Advancing feature selection research. ASU Feature Selection Repository (2010)
  26. Fix, E., Hodges Jr., J.L.: Discriminatory analysis - nonparametric discrimination: consistency properties. Technical report, California University, Berkeley (1951)
  27. Cortes, C., Vapnik, V.: Support-vector networks. Mach. Learn. 20(3), 273–297 (1995)
  28. Maron, M.E.: Automatic indexing: an experimental inquiry. J. ACM 8(3), 404–417 (1961)
  29. John, G.H., Langley, P.: Estimating continuous distributions in Bayesian classifiers. In: Proceedings of the Eleventh Conference on Uncertainty in Artificial Intelligence, pp. 338–345. Morgan Kaufmann Publishers Inc. (1995)
  30. Hall, M., Frank, E., Holmes, G., Pfahringer, B., Reutemann, P., Witten, I.H.: The WEKA data mining software: an update. SIGKDD Explor. Newsl. 11(1), 10–18 (2009)
  31. Zhao, Z., Wang, L., Liu, H., Ye, J.: On similarity preserving feature selection. IEEE Trans. Knowl. Data Eng. 25(3), 619–632 (2013)
  32. Friedman, M.: The use of ranks to avoid the assumption of normality implicit in the analysis of variance. J. Am. Stat. Assoc. 32(200), 675–701 (1937)
  33. Holm, S.: A simple sequentially rejective multiple test procedure. Scand. J. Stat. 6(2), 65–70 (1979)
  34. Buhmann, M.D.: Radial Basis Functions: Theory and Implementations. Cambridge Monographs on Applied and Computational Mathematics. Cambridge University Press, Cambridge (2003)

Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  • Saúl Solorio-Fernández (1)
  • J. Ariel Carrasco-Ochoa (1)
  • José Fco. Martínez-Trinidad (1)
  1. Computer Sciences Department, Instituto Nacional de Astrofísica, Óptica y Electrónica, Puebla, Mexico
