Joint Subspace Learning and Sparse Regression for Feature Selection in Kernel Space

  • Conference paper
  • First Online:
Applications and Techniques in Information Security (ATIS 2018)

Part of the book series: Communications in Computer and Information Science (CCIS, volume 950)

Abstract

In this paper, we propose a novel feature selection method that maps the original data to a kernel space and jointly conducts subspace learning (via locality preserving projection) and feature selection (via a sparsity constraint). The kernel mapping captures nonlinear relationships among the data, while subspace learning preserves the local structure of the data. As a result, redundant and irrelevant features are eliminated and the method selects a large number of informative and discriminative features. Experimental comparisons with several state-of-the-art methods show that the proposed method outperforms them on clustering tasks.
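The following Python sketch illustrates the kind of pipeline the abstract describes: map the data into a kernel space, build a locality-preserving graph embedding, and run l2,1-regularized regression so that feature scores come from the row norms of the projection matrix. The RBF kernel, the k-NN graph construction, the specific objective, and all parameter names here are assumptions made for illustration; they are not the authors' exact formulation.

import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.neighbors import kneighbors_graph

def kernel_sparse_feature_selection(X, n_neighbors=5, gamma=1.0,
                                    alpha=1.0, n_iter=30):
    """Illustrative sketch only: kernel mapping + locality-preserving
    graph embedding + l2,1-regularized regression. This is a
    hypothetical formulation, not the paper's exact objective."""
    n, d = X.shape

    # 1) Kernel mapping: represent each sample by its RBF similarities.
    K = rbf_kernel(X, gamma=gamma)                      # n x n

    # 2) Locality-preserving target: low-order eigenvectors of the
    #    graph Laplacian built from a symmetrized k-NN graph.
    S = kneighbors_graph(X, n_neighbors, mode='connectivity',
                         include_self=False).toarray()
    S = np.maximum(S, S.T)                              # symmetrize
    L = np.diag(S.sum(axis=1)) - S                      # graph Laplacian
    _, vecs = np.linalg.eigh(L)
    Y = vecs[:, 1:n_neighbors + 1]                      # local-structure embedding

    # 3) Sparse regression in kernel space:
    #    min_W ||K W - Y||_F^2 + alpha * ||W||_{2,1},
    #    solved by iteratively reweighted least squares.
    W = np.linalg.lstsq(K, Y, rcond=None)[0]
    for _ in range(n_iter):
        row_norms = np.linalg.norm(W, axis=1) + 1e-8
        D = np.diag(1.0 / (2.0 * row_norms))            # l2,1 reweighting
        W = np.linalg.solve(K.T @ K + alpha * D, K.T @ Y)

    # 4) Rank kernel-space features by the row norms of W;
    #    a larger norm indicates a more informative feature.
    scores = np.linalg.norm(W, axis=1)
    return np.argsort(scores)[::-1], scores

Under these assumptions, the returned ranking can be truncated to keep the top-scoring features before running a downstream clustering algorithm, which mirrors the evaluation protocol the abstract mentions.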

Supported by organization x.



Author information

Corresponding author: Zhi Zhong.


Copyright information

© 2018 Springer Nature Singapore Pte Ltd.

About this paper

Cite this paper

Chen, L., Zhong, Z. (2018). Joint Subspace Learning and Sparse Regression for Feature Selection in Kernel Space. In: Chen, Q., Wu, J., Zhang, S., Yuan, C., Batten, L., Li, G. (eds) Applications and Techniques in Information Security. ATIS 2018. Communications in Computer and Information Science, vol 950. Springer, Singapore. https://doi.org/10.1007/978-981-13-2907-4_13

  • DOI: https://doi.org/10.1007/978-981-13-2907-4_13

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-13-2906-7

  • Online ISBN: 978-981-13-2907-4

  • eBook Packages: Computer Science (R0)
