
Bat algorithm-based back propagation approach for short-term load forecasting considering weather factors


Abstract

This paper proposes a bat algorithm-based back propagation approach to short-term load forecasting that considers weather factors such as temperature and humidity. Load forecasting accuracy is very important in the restructured power system because it plays a major role in cost-effective risk management and operation planning. Load/demand forecasting is difficult owing to the nonlinear and random behavior of the load itself, its dependence on factors such as weather conditions, variations in the social and economic environment, and the price of electrical power in the deregulated environment. Forecasting accuracy also significantly affects the cost incurred by power utilities in the operational planning of the energy supply. To show the effectiveness of the proposed approach, load demand data from the PJM (Pennsylvania–New Jersey–Maryland) system are considered. The simulation results show that the day-ahead hourly load forecasts obtained with the proposed method are very accurate, with very small errors.



References

  1. Xia F, Fan L (2012) Application of artificial neural network (ANN) for prediction of power load. Adv Int Soft Comput 115:673–677


  2. Koponen P, Mutanen A, Niska H (2014) Assessment of some methods for short-term load forecasting. In: IEEE PES innovative smart grid technologies Europe, Istanbul, pp 1–6

  3. Ray P, Sen S, Barisal AK (2014) Hybrid methodology for short-term load forecasting. In: IEEE international conference on power electronics, drives and energy systems, Mumbai, pp 1–6

  4. Koo BG, Lee HS, Park J (2014) A study on short-term electric load forecasting using wavelet transform. In: IEEE PES innovative smart grid technologies Europe, Istanbul, pp 1–6

  5. Kuo SS, Lee CM, Ko CN (2014) Hybrid learning algorithm based neural networks for short-term load forecasting. In: International conference on fuzzy theory and its applications, Kaohsiung, pp 105–110

  6. Pandey AK, Sahay KB, Tripathi MM, Chandra D (2014) Short-term load forecasting of UPPCL using ANN. In: 6th IEEE power India international conference, Delhi, pp 1–6

  7. Dudek G (2015) Short-term load cross-forecasting using pattern-based neural models. In: 16th International scientific conference on electric power engineering, Kouty nad Desnou, pp 179–183

  8. Ishik MY, Göze T, Özcan İ, Güngör VÇ, Aydın Z (2015) Short term electricity load forecasting: a case study of electric utility market in Turkey. In: 3rd International Istanbul smart grid congress and fair, Istanbul, pp 1–5

  9. Patel H, Pandya M, Aware M (2015) Short term load forecasting of Indian system using linear regression and artificial neural network. In: 5th Nirma University international conference on engineering, Ahmedabad, pp 1–5

  10. Luthuli QW, Folly KA (2016) Short term load forecasting using artificial intelligence. In: IEEE PES Power Africa, Livingstone, pp 129–133

  11. Sahay KB, Sahu S, Singh P (2016) Short-term load forecasting of Toronto Canada by using different ANN algorithms. In: IEEE 6th international conference on power systems, New Delhi, pp 1–6

  12. Sreekumar S, Verma J, Sujil A, Kumar R (2016) New short term load forecasting models based on growth rate scaling and simple averaging. In: IEEE 6th international conference on power systems, New Delhi, pp 1–6

  13. Bećirović E, Ćosović M (2016) Machine learning techniques for short-term load forecasting. In: 4th International symposium on environmental friendly energies and applications, Belgrade, Serbia

  14. Liu J, Zhao J, Ouyang Y, Wang B, Liu Y, Ouyang H, Hao Q, Lu Y (2016) Short-term load forecasting based on parallel frameworks. In: 12th International conference on natural computation, fuzzy systems and knowledge discovery, Changsha, pp 1474–1478

  15. Ribeiro GT, Gritti MC, Ayala HVH, Mariani VC, Coelho LDS (2016) Short-term load forecasting using wavenet ensemble approaches. In: International joint conference on neural networks, Vancouver, BC, Canada, pp 727–734

  16. Khuntia SR, Rueda JL, van der Meijden MAMM (2016) Neural network-based load forecasting and error implication for short-term horizon. In: International joint conference on neural networks, Vancouver, BC, Canada, pp 4970–4975

  17. Ghofrani M, Carson D, Ghayekhloo M (2016) Hybrid clustering-time series-Bayesian neural network short-term load forecasting method. In: North American Power Symposium, Denver, CO, USA, pp 1–5

  18. Hu R, Wen S, Zeng Z, Huang T (2017) A short-term power load forecasting model based on the generalized regression neural network with decreasing step fruit fly optimization algorithm. Neurocomputing 221:24–31


  19. Wu J, Wang J, Lu H, Dong Y, Lu X (2013) Short term load forecasting technique based on the seasonal exponential adjustment method and the regression model. Energy Convers Manag 70:1–9


  20. Bahrami S, Hooshmand RA, Parastegari M (2014) Short term electric load forecasting by wavelet transform and grey model improved by PSO (particle swarm optimization) algorithm. Energy 72:434–442


  21. Zjavka L, Snášel V (2016) Short-term power load forecasting with ordinary differential equation substitutions of polynomial networks. Electr Power Syst Res 137:113–123


  22. Abdoos A, Hemmati M, Abdoos AA (2015) Short term load forecasting using a hybrid intelligent method. Knowl Based Syst 76:139–147


  23. Khwaja AS, Zhang X, Anpalagan A, Venkatesh B (2017) Boosted neural networks for improved short-term electric load forecasting. Electr Power Syst Res 143:431–437


  24. Dudek G (2016) Neural networks for pattern-based short-term load forecasting: a comparative study. Neurocomputing 205:64–74


  25. Li S, Goel L, Wang P (2016) An ensemble approach for short-term load forecasting by extreme learning machine. Appl Energy 170:22–29


  26. Dudek G (2016) Pattern based local linear regression models for short-term load forecasting. Electr Power Syst Res 130:139–147


  27. Ghofrani M, Ghayekhloo M, Arabali A, Ghayekhloo A (2015) A hybrid short-term load forecasting with a new input selection framework. Energy 81:777–786


  28. Ray P, Mishra D (2014) Artificial intelligence based fault location in a distribution system. In: International conference on information technology, Bhubaneswar, pp 18–23

  29. Ray P, Mishra DP, Lenka RK (2016) Short term load forecasting by artificial neural network. In: International conference on next generation intelligent systems, Kottayam, pp 1–6

  30. Ray P, Mishra DP, Panda DD (2015) Hybrid technique for fault location of a distribution line. In: Annual IEEE India conference, New Delhi, pp 1–6

  31. Li C, Yao L, Chen W, Li S (2015) Comments on nonlocal effects in nano-cantilever beams. Int J Eng Sci 87:47–57


  32. Erkmen I, Topalli AK (2003) Four methods for short-term load forecasting using the benefits of artificial intelligence. Electr Eng 85(4):229–233


  33. Rego L, Sumaili J, Miranda V, Francês C, Silva M, Santana Á (2016) Mean shift densification of scarce data sets in short-term electric power load forecasting for special days. Electr Eng. doi:10.1007/s00202-016-0424-z

  34. Niu D, Dai S (2017) A short-term load forecasting model with a modified particle swarm optimization algorithm and least squares support vector machine based on the denoising method of empirical mode decomposition and grey relational analysis. Energies 10(3):408


  35. Singh VV, Srivastava A (2014) An introduction to load forecasting: conventional and modern technologies. IRACST Eng Sci Technol Int J 4(2):62–66


  36. Srivastava AK, Pandey AS, Singh D (2016) Short-term load forecasting methods: a review. In: International conference on emerging trends in electrical electronics & sustainable energy systems, Sultanpur, pp 130–138

  37. Rao GM, Narasimhaswamy I, Kumar BS (2010) Deregulated power system load forecasting using artificial intelligence. In: IEEE international conference on computational intelligence and computing research, Coimbatore, pp 1–5

  38. Contaxi E, Delkis C, Kavatza S, Vournas C (2006) The effect of humidity in a weather-sensitive peak load forecasting model. In: IEEE PES power systems conference and exposition, Atlanta, GA, pp 1528–1534

  39. Yang XS (2010) A new metaheuristic bat-inspired algorithm. In: Nature inspired cooperative strategies for optimization, studies in computational intelligence, vol 284. Springer, Berlin, Heidelberg, pp 65–74

  40. PJM: Pennsylvania–New Jersey–Maryland market. http://www.pjm.com. Accessed 19 June 2017

  41. Reddy SS, Momoh JA (2014) Short term electrical load forecasting using back propagation neural networks. In: North American Power Symposium, Pullman, WA, pp 1–6

  42. Reddy SS, Jung CM (2016) Short-term load forecasting using artificial neural networks and wavelet transform. Int J Appl Eng Res 11(19):9831–9836



Author information


Corresponding author

Correspondence to S. Surender Reddy.

Appendix: back propagation neural network (BPNN)

Artificial neural networks (ANNs) can capture the nonlinear dependencies between the load demand and the factors influencing it. An ANN maps the input–output relationship by approximating linear/nonlinear mathematical functions. ANN structures can be classified into three groups according to the arrangement of neurons and the connection patterns of the layers: feed forward (back propagation networks), feedback (recurrent neural networks and adaptive resonance memories) and self-organizing (Kohonen networks). In this paper, back propagation neural networks (BPNNs) are used; they utilize the available input and output data and adjust the weights with the help of an objective (loss) function. Feed-forward networks consist of three or more layers of nodes, i.e., one input layer, one output layer and one or more hidden layers [41].

A BPNN is composed of one input layer, one or more hidden layers and one output layer. Its learning process has two stages: the input information is transmitted in the forward direction, and the error is transmitted in the backward direction. In the forward stage, the input information passes from the input layer through the hidden layers to the output layer. If the output of the output layer differs from the desired output, the output error is calculated and transmitted in the backward direction, and the weights between the neurons of every layer are modified so as to make the error as small as possible. The network is then said to be trained for the given data or application. Figure 4 depicts the structure of a 3-layer BPNN, in which 'i' denotes the input layer, 'j' the hidden layer, 'k' the output layer, 'I' the input and 'O' the output.

Fig. 4 Structure of back propagation neural network (BPNN)

The input to the \(j\)th neuron of the hidden layer is expressed as [42],

$$\mathrm{net}_j = \sum_i W_{ji} O_i, \qquad i = 1, 2, 3, \ldots, N_I \tag{9}$$

where \(N_I\) is the number of input nodes. The output of the \(j\)th neuron is expressed as,

$$O_j = g\left( \mathrm{net}_j \right) \tag{10}$$

The input to the \(k\)th neuron of the output layer is expressed as,

$$\mathrm{net}_k = \sum_j W_{kj} O_j, \qquad j = 1, 2, 3, \ldots, N_H \tag{11}$$

where \(N_H\) is the number of hidden nodes. The output of the \(k\)th neuron is expressed as,

$$O_k = g\left( \mathrm{net}_k \right) \tag{12}$$

where \(g\) is the sigmoid function, expressed as,

$$g\left( x \right) = \frac{1}{1 + e^{-x}} \tag{13}$$
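
To make the forward computation of Eqs. (9)–(13) concrete, a minimal sketch in Python/NumPy is given below; the layer sizes, weight-matrix shapes and function names are illustrative assumptions and not part of the original paper.

```python
import numpy as np

def sigmoid(x):
    # Eq. (13): g(x) = 1 / (1 + exp(-x))
    return 1.0 / (1.0 + np.exp(-x))

def forward_pass(I, W_ji, W_kj):
    """Forward pass of a 3-layer BPNN (illustrative sketch).

    I    : input vector, shape (N_I,)
    W_ji : input-to-hidden weights, shape (N_H, N_I)
    W_kj : hidden-to-output weights, shape (N_O, N_H)
    """
    net_j = W_ji @ I      # Eq. (9):  net_j = sum_i W_ji * O_i
    O_j = sigmoid(net_j)  # Eq. (10)
    net_k = W_kj @ O_j    # Eq. (11): net_k = sum_j W_kj * O_j
    O_k = sigmoid(net_k)  # Eq. (12)
    return O_j, O_k
```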

The principle of the BPNN is error back propagation during the learning process. Learning is accomplished by minimizing an objective function, namely the sum of the squared errors between the actual network output and the target output, and the gradient descent algorithm is used to derive the update formulas. During learning, the target output of the \(k\)th neuron of the output layer is \(T_{pk}\), the corresponding actual network output is \(O_{pk}\), and the average sum of squared errors of the system is expressed as,

$$E = \frac{1}{2p} \sum_p \sum_k \left( T_{pk} - O_{pk} \right)^2 = \frac{1}{2} \sum_k \left( T_k - O_k \right)^2 \tag{14}$$

where \(p\) is the number of training samples employed in training the network and \(E\) is the objective function. According to the gradient descent algorithm, the increment (adjustment) of every weight is derived as,

$$\Delta W_{kj} = \eta\, \delta_k O_j \tag{15}$$
$$\Delta W_{ji} = \eta\, \delta_j O_i \tag{16}$$

where \(\eta \) is the learning rate, and \(\delta _k\), \(\delta _j\) are expressed as,

$$\delta_k = \left( T_k - O_k \right) O_k \left( 1 - O_k \right) \tag{17}$$
$$\delta_j = O_j \left( 1 - O_j \right) \sum_k \delta_k W_{kj} \tag{18}$$
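
Continuing the same sketch, the local error terms of Eqs. (17)–(18) and the weight increments of Eqs. (15)–(16) might be computed as follows for a single training sample (an illustrative rendering, not the authors' implementation; the learning-rate value is an assumption).

```python
def backward_pass(I, O_j, O_k, T, W_kj, eta=0.1):
    """Backward pass for one training sample (illustrative sketch).

    T   : target output vector, shape (N_O,)
    eta : learning rate (assumed value)
    """
    delta_k = (T - O_k) * O_k * (1.0 - O_k)           # Eq. (17)
    delta_j = O_j * (1.0 - O_j) * (W_kj.T @ delta_k)  # Eq. (18)
    dW_kj = eta * np.outer(delta_k, O_j)              # Eq. (15)
    dW_ji = eta * np.outer(delta_j, I)                # Eq. (16)
    return dW_kj, dW_ji
```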

If the learning rate \(\eta\) is large, the adjustment (increment) of every weight also becomes large; this accelerates the training of the network but can produce oscillations. To avoid the oscillations caused by an increased learning rate \(\eta\), a momentum term is added, and the update is expressed as,

$$\Delta W_{ji}\left( n + 1 \right) = \eta\, \delta_j O_i + \alpha\, \Delta W_{ji}\left( n \right) \tag{19}$$

where \(\alpha\) is the proportionality constant or momentum factor. Once the accuracy requirement is satisfied during back propagation training, the interconnection weights between all nodes are fixed and stored, and the training of the BPNN is complete. The trained network can then be used to identify unknown samples; this stage is usually termed testing.
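
A training loop that applies the momentum-augmented update of Eq. (19) and stops once the error of Eq. (14) falls below a tolerance could look like the sketch below; the momentum factor, epoch count and tolerance are assumed values chosen only for illustration.

```python
def train(X, T, W_ji, W_kj, eta=0.1, alpha=0.9, epochs=1000, tol=1e-4):
    """Gradient descent with momentum for the BPNN (illustrative sketch)."""
    mom_ji = np.zeros_like(W_ji)
    mom_kj = np.zeros_like(W_kj)
    for _ in range(epochs):
        E = 0.0
        for I, target in zip(X, T):
            O_j, O_k = forward_pass(I, W_ji, W_kj)
            dW_kj, dW_ji = backward_pass(I, O_j, O_k, target, W_kj, eta)
            # Eq. (19): new increment = eta * delta * O + alpha * previous increment
            mom_kj = dW_kj + alpha * mom_kj
            mom_ji = dW_ji + alpha * mom_ji
            W_kj += mom_kj
            W_ji += mom_ji
            E += 0.5 * np.sum((target - O_k) ** 2)  # per-sample error, cf. Eq. (14)
        if E / len(X) < tol:  # stop once the accuracy requirement is satisfied
            break
    return W_ji, W_kj
```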


Cite this article

Reddy, S.S. Bat algorithm-based back propagation approach for short-term load forecasting considering weather factors. Electr Eng 100, 1297–1303 (2018). https://doi.org/10.1007/s00202-017-0587-2
