
Multi-class Semi-supervised Logistic I-RELIEF Feature Selection Based on Nearest Neighbor

  • Baige Tang
  • Li Zhang
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11440)

Abstract

The multi-class semi-supervised logistic I-RELIEF (MSLIR) algorithm has been shown to select features effectively using both labeled and unlabeled samples. Unfortunately, MSLIR performs poorly when predicting labels for unlabeled samples. To address this issue, this paper presents a novel multi-class semi-supervised logistic I-RELIEF algorithm based on nearest neighbor (MSLIR-NN) for multi-class feature selection tasks. To generate better margin vectors for unlabeled samples, MSLIR-NN first predicts the labels of unlabeled samples using the nearest neighbor scheme and then calculates their margin vectors according to these estimated labels. Experimental results demonstrate that MSLIR-NN improves the prediction accuracy on unlabeled data.
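
As a concrete illustration of the two-step scheme described in the abstract, the following minimal Python/NumPy sketch pseudo-labels the unlabeled samples with a Euclidean 1-NN rule and then computes componentwise RELIEF-style margin vectors, |x - nearest miss| - |x - nearest hit|, from the estimated labels. The function names nn_labels and relief_margins, the Euclidean metric, and the toy data are illustrative assumptions, not the paper's implementation; MSLIR-NN would feed such margins into the logistic I-RELIEF feature-weight update, which is omitted here.

```python
import numpy as np

def nn_labels(X_lab, y_lab, X_unlab):
    # Step 1: assign each unlabeled sample the label of its nearest
    # labeled neighbor (Euclidean 1-NN -- an assumption for this sketch).
    d = np.linalg.norm(X_unlab[:, None, :] - X_lab[None, :, :], axis=2)
    return y_lab[np.argmin(d, axis=1)]

def relief_margins(X, y):
    # Step 2: componentwise RELIEF-style margin for each sample:
    # |x - nearest miss| - |x - nearest hit|.
    # Assumes every class contains at least two samples.
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)  # a sample is never its own neighbor
    margins = np.empty_like(X, dtype=float)
    for i in range(len(X)):
        same = (y == y[i])
        hit = np.argmin(np.where(same, d[i], np.inf))    # nearest same-class sample
        miss = np.argmin(np.where(~same, d[i], np.inf))  # nearest other-class sample
        margins[i] = np.abs(X[i] - X[miss]) - np.abs(X[i] - X[hit])
    return margins

# Usage: pseudo-label the unlabeled pool, then compute margins on the union.
rng = np.random.default_rng(0)
X_lab = rng.normal(size=(20, 5))
y_lab = rng.integers(0, 3, size=20)
X_unlab = rng.normal(size=(30, 5))
y_hat = nn_labels(X_lab, y_lab, X_unlab)
M = relief_margins(np.vstack([X_lab, X_unlab]), np.concatenate([y_lab, y_hat]))
```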

Keywords

Logistic I-RELIEF · Feature selection · Multi-class classification · Semi-supervised · Nearest neighbor

Acknowledgements

This work was supported in part by the National Natural Science Foundation of China under Grant No. 61373093, by the Soochow Scholar Project of Soochow University, and by the Six Talent Peak Project of Jiangsu Province of China.

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. School of Computer Science and Technology, Joint International Research Laboratory of Machine Learning and Neuromorphic Computing, Soochow University, Suzhou, China
  2. Provincial Key Laboratory for Computer Information Processing Technology, Soochow University, Suzhou, China
