An Improved Double Hidden-Layer Variable Length Incremental Extreme Learning Machine Based on Particle Swarm Optimization

  • Qiuwei Li
  • Fei Han
  • Qinghua Ling
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10955)

Abstract

Extreme learning machine (ELM) has been widely used in diverse domains. With the development of deep learning, integrating ELM with deep learning methods has become a promising approach for feature extraction and classification. However, because its hidden nodes are generated randomly, ELM may require a large number of them and suffer from ill-conditioning. In this paper, an effective hybrid approach combining a variable-length incremental ELM with the particle swarm optimization (PSO) algorithm, called PSO-VIELM, is proposed to tune the hidden-node weights and extract features. The new method builds two hidden layers to obtain a compact structure with better generalization performance. The first hidden layer, the extraction layer, performs feature learning on the raw data; its hidden nodes are added dynamically, and PSO updates the weights of these nodes using the fitting error as the fitness function. The second hidden layer, the classification layer, classifies the features produced by the extraction layer, using cross-entropy as the fitness function to update the network weights. To find an appropriate number of hidden nodes, node growth stops once the fitness on the validation set rebounds. Results on several datasets show that PSO-VIELM achieves better generalization performance than other constructive ELMs.
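The abstract describes incremental hidden-node growth with PSO-tuned input weights and a validation-based stopping rule. The following is a minimal sketch of that idea, not the authors' implementation: it collapses the two hidden layers into a single incrementally grown layer, assumes a sigmoid activation and RMSE fitness, and uses illustrative names (pso_optimize_node, grow_elm, n_particles, etc.) that do not come from the paper.

```python
# Sketch: grow ELM hidden nodes one at a time; PSO tunes each new node's input
# weights and bias to minimise training error; stop when validation error rebounds.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def output_weights(H, T):
    # Least-squares (pseudo-inverse) output weights, as in standard ELM.
    return np.linalg.pinv(H) @ T

def pso_optimize_node(X, T, H_prev, n_particles=20, n_iter=30):
    """Global-best PSO over one candidate node's input weights and bias,
    minimising the training RMSE of the enlarged network."""
    rng = np.random.default_rng()
    dim = X.shape[1] + 1                          # input weights + bias
    pos = rng.uniform(-1, 1, (n_particles, dim))
    vel = np.zeros_like(pos)

    def fitness(p):
        h = sigmoid(X @ p[:-1] + p[-1])           # candidate node's activations
        H = np.column_stack([H_prev, h]) if H_prev is not None else h[:, None]
        beta = output_weights(H, T)
        return np.sqrt(np.mean((H @ beta - T) ** 2))

    pbest = pos.copy()
    pbest_f = np.array([fitness(p) for p in pos])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = pos + vel
        f = np.array([fitness(p) for p in pos])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = pos[improved], f[improved]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest

def grow_elm(X_tr, T_tr, X_val, T_val, max_nodes=50):
    """Add PSO-tuned hidden nodes until the validation error rebounds."""
    W, H_tr, H_val = [], None, None
    best_val, best_beta = np.inf, None
    for _ in range(max_nodes):
        p = pso_optimize_node(X_tr, T_tr, H_tr)
        h_tr = sigmoid(X_tr @ p[:-1] + p[-1])[:, None]
        h_val = sigmoid(X_val @ p[:-1] + p[-1])[:, None]
        H_tr = h_tr if H_tr is None else np.column_stack([H_tr, h_tr])
        H_val = h_val if H_val is None else np.column_stack([H_val, h_val])
        beta = output_weights(H_tr, T_tr)
        val_err = np.sqrt(np.mean((H_val @ beta - T_val) ** 2))
        if val_err > best_val:                    # validation fitness rebounds: stop growing
            break
        best_val, best_beta = val_err, beta
        W.append(p)
    return W, best_beta
```

In the paper's full method this growth-and-PSO step is applied to both the extraction layer (fitting-error fitness) and the classification layer (cross-entropy fitness); the sketch above shows only the shared mechanism.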

Keywords

Extreme Learning Machine · Particle Swarm Optimization · Feature extraction · Auto-encoder

Notes

Acknowledgements

This work was supported by the National Natural Science Foundation of China [Nos. 61572241 and 61271385], the National Key R&D Program of China [No. 2017YFC0806600], the Foundation of the Peak of Six Talents of Jiangsu Province [No. 2015-DZXX-024], the Fifth 333 High Level Talented Person Cultivating Project of Jiangsu Province [No. (2016) III-0845], and the Research Innovation Program for College Graduates of Jiangsu Province [1291170030].


Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  1. School of Computer Science and Communication Engineering, Jiangsu University, Zhenjiang, China
  2. School of Computer Science and Engineering, Jiangsu University of Science and Technology, Zhenjiang, China