Improving the Training Pattern in Back-Propagation Neural Networks Using Holt-Winters’ Seasonal Method and Gradient Boosting Model

  • Chapter
  • First Online:
Applications of Machine Learning

Part of the book series: Algorithms for Intelligent Systems (AIS)

Abstract

In this paper, we propose an improved training pattern for back-propagation neural networks (BPNN) that uses Holt-Winters’ seasonal method and a gradient boosting model (NHGB). The approach removes the errors that impair the hidden layers of the BPNN and thereby improves predictive performance. It updates the weights and decays the error using Holt-Winters’ seasonal method and the gradient boosting model, which shortens the otherwise long convergence time. NHGB is compared with existing methods on average initial error, root mean square error (RMSE), accuracy, sensitivity, and specificity. The results show that NHGB reduces RMSE and increases accuracy when classifying the datasets.
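The abstract describes NHGB only at a high level, so the following is a minimal sketch of one possible reading, not the authors’ implementation: a BPNN is trained epoch by epoch, its per-epoch error curve is smoothed and decayed with an additive Holt-Winters recursion, and a gradient boosting model is stacked on the network’s output to correct the remaining errors before the reported metrics (RMSE, accuracy, sensitivity, specificity) are computed. The dataset, network size, Holt-Winters parameters (alpha, beta, gamma, season length m), and the stacking scheme are all illustrative assumptions.

```python
# Hypothetical sketch of the NHGB idea: BPNN training with a Holt-Winters
# smoothed error curve and a gradient boosting correction stage.
# All names, sizes, and hyper-parameters below are illustrative assumptions.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

# A UCI-style binary classification set stands in for the chapter's datasets.
X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
scaler = StandardScaler().fit(X_tr)
X_tr, X_te = scaler.transform(X_tr), scaler.transform(X_te)

# Back-propagation neural network trained one epoch per fit() call.
bpnn = MLPClassifier(hidden_layer_sizes=(32,), learning_rate_init=0.01,
                     max_iter=1, warm_start=True, random_state=0)
epoch_error = []
for _ in range(100):
    bpnn.fit(X_tr, y_tr)            # one back-propagation pass (warm start)
    epoch_error.append(bpnn.loss_)  # training error after this epoch

def holt_winters_additive(x, alpha=0.5, beta=0.1, gamma=0.1, m=10):
    """Classical additive Holt-Winters recursion over the error series:
    level, trend and seasonal components with season length m."""
    x = np.asarray(x, dtype=float)
    level, trend = x[0], x[1] - x[0]
    season = list(x[:m] - x[:m].mean())
    smoothed = []
    for t, obs in enumerate(x):
        s = season[t % m]
        new_level = alpha * (obs - s) + (1 - alpha) * (level + trend)
        trend = beta * (new_level - level) + (1 - beta) * trend
        season[t % m] = gamma * (obs - new_level) + (1 - gamma) * s
        level = new_level
        smoothed.append(level + trend + season[t % m])
    return np.array(smoothed)

smoothed_error = holt_winters_additive(epoch_error)
print(f"final training error: raw {epoch_error[-1]:.4f}, "
      f"Holt-Winters smoothed {smoothed_error[-1]:.4f}")

# Gradient boosting model stacked on the BPNN output to correct residual errors.
p_tr = bpnn.predict_proba(X_tr)[:, 1]
gbm = GradientBoostingClassifier(n_estimators=200, learning_rate=0.05,
                                 random_state=0)
gbm.fit(np.column_stack([X_tr, p_tr]), y_tr)

p_te = bpnn.predict_proba(X_te)[:, 1]
proba = gbm.predict_proba(np.column_stack([X_te, p_te]))[:, 1]
pred = (proba >= 0.5).astype(int)

# Metrics reported in the chapter: RMSE, accuracy, sensitivity, specificity.
rmse = np.sqrt(np.mean((proba - y_te) ** 2))
tn, fp, fn, tp = confusion_matrix(y_te, pred).ravel()
print(f"RMSE={rmse:.3f}  accuracy={(tp + tn) / len(y_te):.3f}  "
      f"sensitivity={tp / (tp + fn):.3f}  specificity={tn / (tn + fp):.3f}")
```

In this sketch the Holt-Winters curve only documents how the training error decays; in the chapter it also drives the weight update, a detail the abstract does not specify.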

Author information

Corresponding author

Correspondence to S. Brilly Sangeetha.

Copyright information

© 2020 Springer Nature Singapore Pte Ltd.

About this chapter

Cite this chapter

Brilly Sangeetha, S., Wilfred Blessing, N.R., Yuvaraj, N., Adeline Sneha, J. (2020). Improving the Training Pattern in Back-Propagation Neural Networks Using Holt-Winters’ Seasonal Method and Gradient Boosting Model. In: Johri, P., Verma, J., Paul, S. (eds) Applications of Machine Learning. Algorithms for Intelligent Systems. Springer, Singapore. https://doi.org/10.1007/978-981-15-3357-0_13

Download citation

  • DOI: https://doi.org/10.1007/978-981-15-3357-0_13

  • Published:

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-15-3356-3

  • Online ISBN: 978-981-15-3357-0

  • eBook Packages: Engineering, Engineering (R0)
