Augmented Online Sequential Quaternion Extreme Learning Machine


Online sequential extreme learning machine (OS-ELM) is one of the most popular real-time learning strategies for single-hidden-layer feedforward neural networks, owing to its fast learning speed and excellent generalization ability. When dealing with quaternion signals, traditional real-valued learning models usually provide only suboptimal solutions compared with their quaternion-valued counterparts. However, an online sequential quaternion extreme learning machine (OS-QELM) model is still lacking in the literature. To fill this gap, this paper establishes a framework for the derivation and design of OS-QELM. Specifically, we first derive a standard OS-QELM, and then propose two augmented OS-QELM models that capture the complete second-order statistics of noncircular quaternion signals. The corresponding regularized models and two approaches to reducing the computational complexity are also derived and discussed. Benefiting from the quaternion algebra and the augmented structure, the proposed models outperform OS-ELM in simulations on several benchmark quaternion regression problems and colour face recognition problems.
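For context, the standard real-valued OS-ELM that the paper extends to the quaternion domain can be sketched as a recursive least-squares update of the output weights over incoming data chunks. The sketch below is a minimal NumPy illustration under assumed settings (sigmoid activation, network sizes, and a toy sine target are all illustrative choices, not taken from the paper); it checks that the sequential solution coincides with the batch least-squares solution.

```python
import numpy as np

rng = np.random.default_rng(0)

# Random hidden layer: input weights and biases are fixed once, never trained.
n_in, n_hidden = 3, 20
W = rng.standard_normal((n_in, n_hidden))
b = rng.standard_normal(n_hidden)

def hidden(X):
    """Sigmoid hidden-layer output matrix H."""
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

# --- Initialization phase on a first chunk (X0, T0) ---
X0 = rng.standard_normal((30, n_in))
T0 = np.sin(X0.sum(axis=1, keepdims=True))   # toy regression target
H0 = hidden(X0)
P = np.linalg.inv(H0.T @ H0)                 # assumes H0 has full column rank
beta = P @ H0.T @ T0

# --- Sequential phase: recursive least-squares update per new chunk ---
def os_elm_update(P, beta, Hk, Tk):
    # Woodbury-based update: only a chunk-sized matrix is inverted.
    K = P @ Hk.T @ np.linalg.inv(np.eye(Hk.shape[0]) + Hk @ P @ Hk.T)
    P = P - K @ Hk @ P
    beta = beta + P @ Hk.T @ (Tk - Hk @ beta)
    return P, beta

X_all, T_all = [X0], [T0]
for _ in range(5):
    Xk = rng.standard_normal((10, n_in))
    Tk = np.sin(Xk.sum(axis=1, keepdims=True))
    P, beta = os_elm_update(P, beta, hidden(Xk), Tk)
    X_all.append(Xk)
    T_all.append(Tk)

# Sequential solution should match the batch least-squares solution.
beta_batch = np.linalg.pinv(hidden(np.vstack(X_all))) @ np.vstack(T_all)
err = float(np.max(np.abs(beta - beta_batch)))
print("max |beta_seq - beta_batch| =", err)
```

In the quaternion setting, real matrices become quaternion-valued and transposes become quaternion conjugate transposes; the augmented models additionally stack the input with its three quaternion involutions so the update can exploit the full second-order statistics of noncircular signals.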






This work is supported by the National Natural Science Foundation of China (Nos. 61671099, 61301202) and the Fundamental Research Funds for the Central Universities of China.

Author information



Corresponding author

Correspondence to Huisheng Zhang.




Cite this article

Zhu, S., Wang, H., Lv, H. et al. Augmented Online Sequential Quaternion Extreme Learning Machine. Neural Process Lett (2021).



Keywords

  • Extreme learning machine
  • Online sequential learning
  • Quaternion signal processing
  • Augmented quaternion statistics