Memetic Computing

Volume 10, Issue 1, pp 43–52

Kernel online sequential ELM algorithm with sliding window subject to time-varying environments

Regular Research Paper

Abstract

Extreme learning machine (ELM) is an emerging machine learning algorithm for single-hidden-layer feedforward neural networks (SLFNs). The key strength of ELM is its significantly fast training speed and good generalization performance, since the learning parameters of the hidden nodes are generated randomly. The kernel online sequential ELM (KOS-ELM) is a straightforward extension of the well-known recursive least-squares method to the ELM framework and is a good choice for online learning in stationary applications. However, many applications operate in environments that vary rapidly over time, where it is unreasonable and inaccurate to place equal emphasis on old and new observations. In this paper, we propose a modified KOS-ELM algorithm with a forgetting mechanism (KOS-ELMF) to deal with such time-sensitive applications. A sliding window is applied to limit the active training data and thus 'forget' old observations, and the size of the window is adjusted automatically according to the forecast error. This automatic determination of model parameters avoids manual tuning and saves training time. An empirical study of KOS-ELMF on several benchmark applications shows that the proposed approach achieves more satisfactory and robust performance than other ELM-related algorithms.
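
The following is a minimal Python sketch, not the paper's exact formulation, of a sliding-window kernel online regressor of the kind described above: only the most recent observations inside the window stay active, and the window size grows or shrinks according to the latest forecast error. The RBF kernel, the regularization constant C, the halving/incrementing rule, the class name SlidingWindowKernelELM, and all thresholds are illustrative assumptions; the sketch also re-solves the kernel system on each window instead of using the recursive update derived in the paper.

import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """RBF kernel matrix between row-sample matrices A and B."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * d2)

class SlidingWindowKernelELM:
    """Illustrative sliding-window kernel online ELM-style regressor (hypothetical)."""

    def __init__(self, gamma=1.0, C=100.0, min_win=20, max_win=200, err_tol=0.05):
        self.gamma, self.C = gamma, C
        self.min_win, self.max_win = min_win, max_win
        self.err_tol = err_tol          # error threshold used to adapt the window
        self.window = min_win
        self.X, self.y = [], []

    def predict(self, x):
        if not self.X:
            return 0.0
        Xw = np.asarray(self.X)
        # Regularized kernel system over the active window only.
        K = rbf_kernel(Xw, Xw, self.gamma) + np.eye(len(Xw)) / self.C
        alpha = np.linalg.solve(K, np.asarray(self.y))
        k = rbf_kernel(np.atleast_2d(np.asarray(x, dtype=float)), Xw, self.gamma)
        return float((k @ alpha)[0])

    def update(self, x, y):
        err = abs(self.predict(x) - y)
        # A large forecast error suggests a regime change: shrink the window
        # to forget stale data faster; a small error lets the window grow.
        if err > self.err_tol:
            self.window = max(self.min_win, self.window // 2)
        else:
            self.window = min(self.max_win, self.window + 1)
        self.X.append(np.asarray(x, dtype=float))
        self.y.append(float(y))
        # Discard observations that fall outside the active window.
        self.X, self.y = self.X[-self.window:], self.y[-self.window:]
        return err

For streaming data one would call update(x, y) on each new observation and monitor the returned error to see how the window adapts.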

Keywords

Extreme learning machine · Kernel function · Time-sensitive application · Online sequential learning

Notes

Acknowledgements

This work has been supported by the National Natural Science Foundation of China (NSFC Grant No. 61333002 and No. 61673056).

Copyright information

© Springer-Verlag Berlin Heidelberg 2016

Authors and Affiliations

  1. School of Automation and Electrical Engineering, University of Science and Technology Beijing, Beijing, China