A Novel Scheme for Recruitment Text Categorization Based on KNN Algorithm

  • Wenshuai Qin
  • Wenjie Guo
  • Xin Liu
  • Hui Zhao
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11910)

Abstract

With the rapid development of the Internet, online recruitment has gradually become mainstream. However, when job postings are numerous and varied, job seekers must spend a great deal of time finding a suitable position, which seriously reduces their efficiency; a more fine-grained and efficient classification of recruitment documents is therefore necessary. Common text classification algorithms include KNN (k-Nearest Neighbor), SVM (Support Vector Machine), and Naïve Bayes. In particular, the KNN algorithm is widely used in text classification because it is simple to implement and classifies accurately, but it is inefficient when applied to large-scale recruitment data. This paper improves the original KNN algorithm and proposes the RS-KNN algorithm to achieve rapid, fine-grained classification of recruitment information. Experiments show that the improved algorithm is more efficient and consumes less time than the original algorithm.
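To make the baseline concrete, the following minimal sketch classifies a few toy recruitment snippets with standard TF-IDF features and a k-nearest-neighbor classifier. The example documents, labels, scikit-learn pipeline, and choice of k = 3 are illustrative assumptions; this is the conventional KNN baseline discussed above, not the paper's RS-KNN refinement.

```python
# Minimal sketch of KNN text classification (TF-IDF vectors + cosine-distance
# k-nearest neighbors). Toy data and k=3 are assumptions, not the paper's RS-KNN.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

# Hypothetical recruitment snippets and their job-category labels.
train_docs = [
    "develop backend services in Java and Spring",
    "maintain financial reports and ledgers",
    "design user interfaces and conduct usability tests",
    "build REST APIs and microservices in Python",
    "prepare monthly budgets and audit accounts",
    "create wireframes and visual design mockups",
]
train_labels = ["software", "accounting", "design",
                "software", "accounting", "design"]

# TF-IDF turns each posting into a weighted term vector; the classifier
# assigns the majority label among the k nearest training vectors.
model = make_pipeline(
    TfidfVectorizer(),
    KNeighborsClassifier(n_neighbors=3, metric="cosine"),
)
model.fit(train_docs, train_labels)

print(model.predict(["implement microservices with Java"]))  # e.g. ['software']
```

Because plain KNN compares every query against all training documents, its cost grows with the size of the corpus, which is the inefficiency on large-scale recruitment data that motivates the proposed improvement.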

Keywords

KNN · Text classification · Feature extraction · Job classification

Notes

Acknowledgement

This work is supported by the Postgraduate Education Innovation and Quality Improvement Project of Henan University (SYL18020105).

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. School of Software, Henan University, Kaifeng, China
  2. School of Computer Science and Technology, Beijing Institute of Technology, Beijing, China