
Sparse Matrix Feature Selection in Multi-label Learning

  • Wenyuan Yang
  • Bufang Zhou
  • William Zhu
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9437)

Abstract

High-dimensional data are common in multi-label learning, and dimensionality reduction is an important and challenging task. In this paper, we propose sparse matrix feature selection to reduce the data dimension in multi-label learning. First, the feature selection problem is formalized through a sparse matrix. Second, a sparse matrix feature selection algorithm is proposed. Third, four feature selection methods are compared with the proposed method, and a parameter optimization analysis is also provided. Experiments show that the proposed algorithm outperforms the other methods on most of the tested datasets.
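The abstract does not spell out the paper's formulation, but a common instantiation of sparse-matrix feature selection is joint \(\ell _{2,1}\)-norm minimization (Nie et al., NIPS 2010): learn a weight matrix W minimizing ||XW − Y||² + λ||W||₂,₁, whose row sparsity marks uninformative features. The sketch below is an illustrative implementation of that general technique, not the authors' algorithm; the function name, solver (iteratively reweighted least squares), and toy data are assumptions.

```python
import numpy as np

def l21_feature_selection(X, Y, lam=0.1, n_iter=50, eps=1e-8):
    """Rank features by the row norms of W, where W minimizes
    ||X W - Y||_F^2 + lam * ||W||_{2,1}.

    Illustrative sketch using the iteratively reweighted least-squares
    scheme of Nie et al. (2010): each step solves a reweighted ridge system.
    """
    n, d = X.shape
    D = np.eye(d)  # reweighting matrix, starts as the identity
    for _ in range(n_iter):
        # Closed form of the reweighted subproblem:
        #   W = (X^T X + lam * D)^{-1} X^T Y
        W = np.linalg.solve(X.T @ X + lam * D, X.T @ Y)
        # Update D_ii = 1 / (2 ||w_i||_2), guarding against zero rows;
        # small-norm rows get penalized more, driving them toward zero.
        row_norms = np.linalg.norm(W, axis=1)
        D = np.diag(1.0 / (2.0 * np.maximum(row_norms, eps)))
    scores = np.linalg.norm(W, axis=1)   # large row norm => informative feature
    return np.argsort(scores)[::-1]      # feature indices, best first

# Toy multi-label data: only features 0 and 1 carry label information.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 6))
Y = np.stack([(X[:, 0] > 0), (X[:, 1] > 0)], axis=1).astype(float)
ranking = l21_feature_selection(X, Y, lam=0.5)
print(ranking[:2])  # the two informative features should rank first
```

The ℓ₂,₁ norm couples the labels: a feature is kept or discarded for all labels jointly, which is what distinguishes this family of methods from running a per-label ℓ₁ selector.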

Keywords

Multi-label learning · Feature selection · Sparse matrix · Machine learning

Notes

Acknowledgments

This work is supported in part by the National Natural Science Foundation of China under Grant Nos. 61170128 and 61379049, the Key Project of the Education Department of Fujian Province under Grant No. JA13192, the Zhangzhou Municipal Natural Science Foundation under Grant No. ZZ2013J03, and the Minnan Normal University Doctoral Research Foundation under Grant No. 2004L21424.


Copyright information

© Springer International Publishing Switzerland 2015

Open Access This chapter is licensed under the terms of the Creative Commons Attribution-NonCommercial 2.5 International License (http://creativecommons.org/licenses/by-nc/2.5/), which permits any noncommercial use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.

The images or other third party material in this chapter are included in the chapter's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the chapter's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.

Authors and Affiliations

  1. Lab of Granular Computing, Minnan Normal University, Zhangzhou, China
