
Cluster Computing, Volume 22, Supplement 3, pp 7435–7445

A novel online incremental and decremental learning algorithm based on variable support vector machine

  • Yuantao Chen
  • Jie Xiong
  • Weihong Xu
  • Jingwen Zuo

Abstract

Support vector machines suffer from long execution times and low efficiency when trained on large-scale sample sets. To address this, the paper proposes an online incremental and decremental learning algorithm based on the variable support vector machine (VSVM). Starting from a detailed analysis of the operating mechanism and related algorithms of VSVM, the classifier must be updated each time a sample is added to the training set. Firstly, the online incremental learning algorithm is presented; it makes full use of the information pre-computed in earlier increments, so the classifier does not need to be retrained on the newly enlarged training set. Secondly, an incremental procedure for computing the matrix inverse is given, which greatly reduces the running time of the algorithm and is used to verify the validity of the online learning algorithm. Finally, nine datasets from standard benchmark libraries are selected for pattern classification experiments. The experimental results show that the proposed online learning algorithm maintains correct classification rates while training efficiently. Since a straightforward implementation of the incremental process would require large data storage and lead to slow training, the online learning algorithm based on VSVM effectively solves this problem.
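The core computational idea summarized above, updating the classifier without retraining by growing an inverse matrix one sample at a time, can be illustrated with the standard block-matrix inversion lemma. The sketch below is only a minimal NumPy illustration of that incremental inverse update, assuming an RBF kernel and a small jitter term for numerical stability; it is not the paper's exact VSVM update rule, whose variable-selection and decremental steps are detailed in the full text.

    import numpy as np

    def rbf_kernel(x, y, gamma=0.5):
        """Illustrative RBF kernel; the kernel used by the paper's VSVM may differ."""
        return np.exp(-gamma * np.sum((x - y) ** 2))

    def add_sample_inverse(K_inv, k_new, k_self):
        """Grow the inverse of a kernel matrix by one sample via the block-matrix
        inversion lemma, avoiding a full re-inversion of the enlarged matrix.

        K_inv  : (n, n) inverse of the current kernel matrix
        k_new  : (n,)  kernel values between the new sample and the existing ones
        k_self : scalar kernel value of the new sample with itself
        """
        v = K_inv @ k_new                 # O(n^2) helper product
        schur = k_self - k_new @ v        # Schur complement; positive for a PD kernel
        n = K_inv.shape[0]
        out = np.empty((n + 1, n + 1))
        out[:n, :n] = K_inv + np.outer(v, v) / schur
        out[:n, n] = -v / schur
        out[n, :n] = -v / schur
        out[n, n] = 1.0 / schur
        return out

    # Sanity check against direct inversion on random data.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(5, 3))
    jitter = 1e-6                          # small ridge for numerical stability
    K = np.array([[rbf_kernel(a, b) for b in X] for a in X]) + jitter * np.eye(5)
    K_inv = np.linalg.inv(K)

    x_new = rng.normal(size=3)
    k_new = np.array([rbf_kernel(x_new, a) for a in X])
    k_self = rbf_kernel(x_new, x_new) + jitter
    K_big = np.block([[K, k_new[:, None]],
                      [k_new[None, :], np.array([[k_self]])]])
    assert np.allclose(add_sample_inverse(K_inv, k_new, k_self), np.linalg.inv(K_big))

Growing the inverse this way costs O(n^2) per added sample instead of the O(n^3) of re-inverting from scratch, which is the source of the running-time savings the abstract refers to.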

Keywords

Variable support vector machine · Classification · Online incremental and decremental learning algorithm · Inverse matrix


Acknowledgements

This work is supported by the National Natural Science Foundation of China (Nos. 61772087, 61702052), the Science and Technology Service Platform of Hunan Province (No. 2012TP1001), the Open Research Fund of Hunan Provincial Key Laboratory of Intelligent Processing of Big Data on Transportation (No. 2015TP1005), the Changsha Science and Technology Planning (Nos. KQ1703018, KQ1706064), the Research Foundation of Education Bureau of Hunan Province (Nos. 12C0010, 17A007), the ZOOMLION Intelligent Technology Limited Company (No. 2017zkhx130). We are grateful to anonymous referees for useful comments and suggestions.


Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2018

Authors and Affiliations

  • Yuantao Chen (1)
  • Jie Xiong (3)
  • Weihong Xu (1)
  • Jingwen Zuo (2)

  1. Hunan Provincial Key Laboratory of Intelligent Processing of Big Data on Transportation & School of Computer and Communicational Engineering, Changsha University of Science and Technology, Changsha, People’s Republic of China
  2. Computer Center, College of ChengNan, Changsha University of Science and Technology, Changsha, People’s Republic of China
  3. School of Computer Science, Yangtze University, Jingzhou, People’s Republic of China
