Regression with Support Vector Machines and VGG Neural Networks

  • Conference paper
  • First Online:
The International Conference on Advanced Machine Learning Technologies and Applications (AMLTA2019) (AMLTA 2019)

Part of the book series: Advances in Intelligent Systems and Computing ((AISC,volume 921))

Abstract

In machine learning, classification tasks have been studied extensively, while the equally important task of regression has received far less attention. In this paper, we propose parameterizations and settings, obtained from multiple experiments, for the traditional supervised learning method of Support Vector Machines (SVM) and for the widely used deep learning approach of Convolutional Neural Networks (CNN) based on the Visual Geometry Group architecture (VGGNet). Several datasets are adopted for the regression task: six datasets obtained from the UCI Machine Learning Repository and one handwritten-image dataset converted from MNIST. The accuracy of the regression results generated by the proposed models is validated with the statistical measures of Mean Absolute Error (MAE) and R-squared, i.e. the coefficient of determination. Experimental results demonstrate that VGG has clear advantages over SVM for image recognition and for attributes with strong correlation, whereas SVM performs better than VGG on discrete, irregular, and weakly correlated data. Comparing the three SVM kernel functions shows that in most cases the RBF kernel performs more effectively than the Linear and Poly ones.
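The SVM side of the comparison above can be illustrated with a minimal sketch (not the authors' code): support vector regression with the three kernels the paper compares (linear, poly, RBF), validated with MAE and R-squared. scikit-learn and the synthetic dataset are assumptions for illustration; the paper does not name its toolkit.

```python
from sklearn.datasets import make_regression
from sklearn.metrics import mean_absolute_error, r2_score
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

# Synthetic stand-in for one of the UCI regression datasets.
X, y = make_regression(n_samples=500, n_features=8, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# SVR is sensitive to feature scale, so standardize the inputs first.
scaler = StandardScaler().fit(X_train)
X_train, X_test = scaler.transform(X_train), scaler.transform(X_test)

# Fit one regressor per kernel and score each with MAE and R-squared.
results = {}
for kernel in ("linear", "poly", "rbf"):
    model = SVR(kernel=kernel, C=10.0).fit(X_train, y_train)
    pred = model.predict(X_test)
    results[kernel] = (mean_absolute_error(y_test, pred),
                       r2_score(y_test, pred))
    print(f"{kernel:>6}: MAE={results[kernel][0]:.2f}  "
          f"R^2={results[kernel][1]:.3f}")
```

On this linear synthetic data the linear kernel fits almost perfectly; on the paper's irregular UCI datasets the relative ranking of the kernels can differ, which is exactly the comparison the abstract reports.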



Author information

Correspondence to Shaozhi Wu.


Copyright information

© 2020 Springer Nature Switzerland AG

About this paper


Cite this paper

Wu, S., Liu, C., Wang, Z., Wu, S., Xiao, K. (2020). Regression with Support Vector Machines and VGG Neural Networks. In: Hassanien, A., Azar, A., Gaber, T., Bhatnagar, R., F. Tolba, M. (eds) The International Conference on Advanced Machine Learning Technologies and Applications (AMLTA2019). AMLTA 2019. Advances in Intelligent Systems and Computing, vol 921. Springer, Cham. https://doi.org/10.1007/978-3-030-14118-9_30
