Extreme Learning Machine for Supervised Classification with Self-paced Learning


The extreme learning machine (ELM), a typical machine learning algorithm based on a single-hidden-layer feedforward neural network, has been widely used in classification, clustering, regression, and feature learning. However, the traditional ELM learns from all samples at once, and its sample weights are fixed before training and never updated during the learning process, so its performance is vulnerable to noisy data and outliers; finding a way to address this weakness is therefore worthwhile. In this work, we propose a self-paced ELM, named SP-ELM, for binary and multi-class classification, built on the self-paced learning paradigm. Concretely, the algorithm weighs the importance of each sample according to the loss between its predicted and true values, and it builds the model progressively from simple samples to complex ones. By imposing suitable restrictions, the influence of complex data on the model is reduced. Four different self-paced regularization terms are adopted in the paper to select the instances. Experimental results demonstrate the effectiveness of the proposed method by comparing it with other improved ELMs.
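As a hedged illustration of the training scheme described above (a random, fixed hidden layer with closed-form output weights, plus self-paced sample selection whose age parameter grows so that harder samples are gradually admitted), the sketch below uses the simplest hard-threshold self-paced regularizer; the paper itself studies four regularizers, which this sketch does not reproduce. The function names (`sp_elm_fit`, `sp_elm_predict`) and all hyperparameter values are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def sp_elm_fit(X, y, n_hidden=50, lam=0.5, growth=1.3, n_rounds=5, seed=0):
    """Sketch of self-paced ELM training with the hard regularizer:
    v_i = 1 if the sample's loss is below the age parameter lam, else 0."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights (never trained)
    b = rng.normal(size=n_hidden)                 # random hidden biases (never trained)
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))        # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ y                  # plain ELM solution as a starting point
    for _ in range(n_rounds):
        loss = (H @ beta - y) ** 2                # per-sample squared loss
        v = (loss < lam).astype(float)            # hard self-paced weights: keep "easy" samples
        beta = np.linalg.pinv(H * v[:, None]) @ (v * y)  # weighted least squares on kept samples
        lam *= growth                             # grow the age parameter: admit harder samples
    return W, b, beta

def sp_elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return np.sign(H @ beta)                      # binary labels in {-1, +1}
```

Because samples whose loss exceeds the current threshold receive zero weight, noisy points and outliers contribute nothing to the output-weight solve until (and unless) the growing threshold admits them, which is the robustness mechanism the abstract describes.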


[Figs. 1–7 appear in the full article.]



Author information



Corresponding author

Correspondence to Ruizhi Sun.



About this article


Cite this article

Li, L., Zhao, K., Li, S. et al. Extreme Learning Machine for Supervised Classification with Self-paced Learning. Neural Process Lett (2020). https://doi.org/10.1007/s11063-020-10286-9



Keywords

  • Classification
  • Extreme learning machine
  • Self-paced learning
  • Accuracy