A review of unsupervised feature selection methods

  • Saúl Solorio-Fernández
  • J. Ariel Carrasco-Ochoa
  • José Fco. Martínez-Trinidad


In recent years, unsupervised feature selection methods have attracted considerable interest in many research areas, mainly because of their ability to identify and select relevant features without needing class label information. In this paper, we provide a comprehensive and structured review of the most relevant and recent unsupervised feature selection methods reported in the literature. We present a taxonomy of these methods and describe their main characteristics and the fundamental ideas they are based on. Additionally, we summarize the advantages and disadvantages of the general lines into which we have categorized the methods analyzed in this review. Moreover, an experimental comparison among the most representative methods of each approach is also presented. Finally, we discuss some important open challenges in this research area.
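The label-free selection the abstract describes can be illustrated with a minimal filter-style sketch (a toy example, not a method from this survey): rank features purely by an intrinsic statistic of the data, here variance, and keep the top k, using no class labels at all.

```python
import numpy as np

def select_by_variance(X, k):
    """Return the indices of the k features with the largest variance.

    A toy unsupervised filter: features are scored with an intrinsic
    data statistic (variance), so no class labels are required.
    """
    variances = X.var(axis=0)          # one score per feature (column)
    return np.argsort(variances)[::-1][:k]  # indices, highest variance first

# Small example: feature 0 is constant, feature 2 varies the most.
X = np.array([
    [1.0, 0.0,  10.0],
    [1.0, 1.0, -10.0],
    [1.0, 0.0,  10.0],
    [1.0, 1.0, -10.0],
])
print(select_by_variance(X, 2))  # [2 1] -- the constant feature is dropped
```

Real unsupervised selectors reviewed in the survey replace this variance score with richer criteria (e.g. locality preservation, spectral structure, or sparsity-inducing regularization), but the pipeline shape is the same: score features from the data alone, then select.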


Unsupervised learning · Dimensionality reduction · Unsupervised feature selection · Feature selection for clustering



The first author gratefully acknowledges the National Council of Science and Technology of Mexico (CONACyT) for his Ph.D. fellowship through scholarship 224490.



Copyright information

© Springer Nature B.V. 2019

Authors and Affiliations

  1. Computer Sciences Department, Instituto Nacional de Astrofísica, Óptica y Electrónica, Puebla, Mexico
