Improving the Training Pattern in Back-Propagation Neural Networks Using Holt-Winters’ Seasonal Method and Gradient Boosting Model
In this paper, we propose an improved training pattern for back-propagation neural networks (BPNN) using Holt-Winters' seasonal method and a gradient boosting model (NHGB). The method removes errors that impair the hidden layers of the BPNN and thereby improves predictive performance. By adjusting the weights and decaying the error with Holt-Winters' seasonal method and the gradient boosting model, it shortens the otherwise long convergence time. NHGB is compared with existing methods on average initial error, root mean square error (RMSE), accuracy, sensitivity, and specificity. The results show that NHGB reduces RMSE and increases accuracy in classifying the datasets.
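The evaluation metrics named above are standard; as an illustration (not the paper's own code), they can be computed for a binary classifier as follows, assuming labels coded as 0/1:

```python
import math

def classification_metrics(y_true, y_pred):
    """Accuracy, sensitivity (recall on positives), specificity (recall on negatives)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    sensitivity = tp / (tp + fn) if tp + fn else 0.0
    specificity = tn / (tn + fp) if tn + fp else 0.0
    return accuracy, sensitivity, specificity

def rmse(y_true, y_score):
    """Root mean square error between targets and predicted scores."""
    return math.sqrt(sum((t - s) ** 2 for t, s in zip(y_true, y_score)) / len(y_true))
```

Average initial error is reported in the paper as a separate metric; its exact definition is not given in this excerpt, so it is omitted here.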
Keywords: Back-propagation neural network · Artificial neural network · Weight adjustment · Holt-Winters' seasonal method and gradient boosting model
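For readers unfamiliar with Holt-Winters' seasonal method named in the keywords, the following is a minimal sketch of the standard additive triple-exponential-smoothing recurrences. It is illustrative only: the smoothing constants (`alpha`, `beta`, `gamma`) and the crude initialization are assumptions, and the paper's specific integration of these updates into BPNN weight adjustment is not reproduced here.

```python
def holt_winters_additive(y, season_len, alpha=0.5, beta=0.1, gamma=0.1):
    """One-step-ahead fitted values from additive Holt-Winters smoothing.

    level, trend, and seasonal components are updated with the classic
    recurrences; initialization is deliberately simple for illustration.
    """
    level = y[0]
    trend = y[season_len] - y[0]              # crude initial trend estimate
    seasonal = [y[i] - level for i in range(season_len)]
    fitted = []
    for t, obs in enumerate(y):
        s = seasonal[t % season_len]
        fitted.append(level + trend + s)      # forecast before seeing obs
        last_level = level
        level = alpha * (obs - s) + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
        seasonal[t % season_len] = gamma * (obs - level) + (1 - gamma) * s
    return fitted
```

Production work would typically use a fitted implementation such as `statsmodels`' `ExponentialSmoothing` rather than hand-rolled recurrences.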