Adaptive Graph Learning for Supervised Low-Rank Spectral Feature Selection

  • Zhi Zhong
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 950)


Spectral feature selection (SFS) has attracted increasing attention in recent years. However, conventional SFS has weaknesses that may degrade feature selection performance: (1) SFS generally preserves either the global structure or the local structure of the data, so neither alone provides comprehensive structural information for the model; and (2) graph learning and feature selection are carried out as two separate processes, which makes a globally optimal solution hard to achieve. To address this, a novel SFS method is proposed that introduces a low-rank constraint to capture the inherent structure of the data, and employs adaptive graph learning to couple graph construction and feature learning within a single iterative framework, yielding a robust and accurate model. An optimization algorithm with fast convergence is devised to solve the resulting problem. Comparisons against classical and state-of-the-art feature selection methods show that the proposed method achieves competitive performance.


Keywords: Low-rank constraint · Spectral feature selection · Adaptive graph learning
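The coupled framework described in the abstract can be illustrated with a minimal sketch: alternate between (a) adaptively rebuilding a k-nearest-neighbour similarity graph from the currently projected data and (b) solving a graph-regularized, ridge-style regression for the projection matrix, then ranking features by the row norms of that matrix. All function and parameter names below (`adaptive_graph_sfs`, `lam`, `k`, `n_iter`) are illustrative assumptions, not the paper's actual formulation, which additionally imposes an explicit low-rank constraint.

```python
import numpy as np

def adaptive_graph_sfs(X, Y, k=5, lam=0.1, n_iter=20):
    """Illustrative sketch (not the paper's exact algorithm): couple
    adaptive graph learning with graph-regularized regression, then
    score features by the row norms of the learned weight matrix W."""
    n, d = X.shape
    W = np.linalg.pinv(X) @ Y              # warm start: least-squares fit
    for _ in range(n_iter):
        # (a) adaptive graph: similarities from distances in projected space
        Z = X @ W
        D2 = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
        S = np.exp(-D2)
        np.fill_diagonal(S, 0)
        # keep only the k nearest neighbours per row, symmetrize, normalize
        far = np.argsort(-S, axis=1)[:, k:]
        for i in range(n):
            S[i, far[i]] = 0
        S = (S + S.T) / 2
        S /= S.sum(1, keepdims=True) + 1e-12
        L = np.diag(S.sum(1)) - S          # graph Laplacian
        # (b) graph-regularized ridge-style update of W
        W = np.linalg.solve(X.T @ X + lam * X.T @ L @ X
                            + 1e-6 * np.eye(d), X.T @ Y)
    scores = np.linalg.norm(W, axis=1)     # importance of each feature
    return np.argsort(-scores), W
```

Because the graph is rebuilt from the projected data at every iteration, the learned similarity structure and the selected features inform each other, which is the coupling the abstract contrasts with two-stage SFS pipelines.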



Copyright information

© Springer Nature Singapore Pte Ltd. 2018

Authors and Affiliations

  1. College of Continuing Education, Guangxi Teachers Education University, Nanning, People's Republic of China