
Adaptive graph learning and low-rank constraint for supervised spectral feature selection

  • Zhi Zhong
Multi-Source Data Understanding (MSDU)

Abstract

Spectral feature selection (SFS) improves feature selection by introducing a graph matrix to preserve the structure of the data. However, conventional SFS (1) preserves either the global structure or the local structure of the data in the selected subset, so the learning model lacks comprehensive information and cannot output a robust result; (2) constructs the graph matrix from the original data, whose redundant information usually leads to a suboptimal graph; and (3) conducts feature selection on a fixed graph matrix, so it is easily trapped in a local optimum. We therefore propose a novel SFS method that (1) preserves both the local and the global structure of the original data in the feature-selected subset, providing comprehensive information to the learning model; and (2) integrates graph construction with feature selection, yielding a robust spectral feature selection that more easily reaches the global optimum of feature selection. For the resulting problem, we further provide an optimization algorithm that solves it effectively with fast convergence. Extensive experimental results show that the proposed method outperforms state-of-the-art feature selection methods in terms of classification performance.
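The general scheme the abstract describes, graph-regularized sparse regression for supervised spectral feature selection, can be illustrated with a minimal sketch. This is not the paper's method: the adaptive graph learning and low-rank constraint are omitted, the kNN Gaussian graph is fixed rather than learned, and all function names and parameters (`k`, `alpha`, `beta`, `n_iter`) are illustrative assumptions. It solves min_W ||XW − Y||² + α·tr(WᵀXᵀLXW) + β·||W||₂,₁ by iteratively reweighted least squares and ranks features by the row norms of W.

```python
import numpy as np

def spectral_feature_selection(X, Y, k=5, alpha=0.1, beta=0.1, n_iter=30):
    """Simplified SFS sketch (illustrative, not the paper's algorithm):
    graph-regularized regression with an l2,1-norm row-sparsity penalty,
    solved by iteratively reweighted least squares (IRLS)."""
    n, d = X.shape
    # Fixed k-nearest-neighbour similarity graph on samples (Gaussian weights);
    # the paper instead learns the graph adaptively during selection.
    dist = np.square(X[:, None, :] - X[None, :, :]).sum(-1)
    S = np.exp(-dist / (np.median(dist) + 1e-12))
    np.fill_diagonal(S, 0.0)
    # Keep only the k largest weights per row, then symmetrize.
    drop = np.argsort(S, axis=1)[:, :-k]
    np.put_along_axis(S, drop, 0.0, axis=1)
    S = (S + S.T) / 2
    L = np.diag(S.sum(1)) - S            # graph Laplacian
    A = X.T @ X + alpha * X.T @ L @ X    # data-fit + graph-smoothness term
    D = np.eye(d)                        # IRLS reweighting for the l2,1 norm
    for _ in range(n_iter):
        W = np.linalg.solve(A + beta * D, X.T @ Y)
        row_norms = np.linalg.norm(W, axis=1)
        D = np.diag(1.0 / (2 * row_norms + 1e-12))
    return np.argsort(-row_norms)        # feature indices, most relevant first
```

Features whose rows of W are driven toward zero by the ℓ2,1 penalty end up at the tail of the ranking, so selecting the top-m indices gives the feature subset.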

Keywords

Graph learning · Low-rank constraint · Spectral feature selection

Notes

Acknowledgements

This work was supported by the program "Research and Development of an Intelligent Logistics Management System Based on a Beidou Multi-functional Information Acquisition and Monitoring Terminal" (Grant No. 2016AB04097).

Compliance with ethical standards

Conflict of interest

We wish to draw the attention of the Editor to the following facts, which may be considered potential conflicts of interest, and to significant financial contributions to this work.


Copyright information

© Springer-Verlag London Ltd., part of Springer Nature 2019

Authors and Affiliations

  1. College of Continuing Education, Guangxi Teachers Education University, Nanning, People's Republic of China
