
Online Sequential Extreme Learning of Sparse Ridgelet Kernel Regressor for Nonlinear Time-Series Prediction

  • Conference paper
Intelligent Science and Intelligent Data Engineering (IScIDE 2011)

Part of the book series: Lecture Notes in Computer Science (LNIP, volume 7202)


Abstract

In this paper, inspired by Multiscale Geometric Analysis (MGA), a Sparse Ridgelet Kernel Regressor (SRKR) is constructed by combining ridgelet theory with the kernel trick. Given the advantages of sequential learning over batch learning, we exploit the kernel method in an online setting, using a sequential extreme learning scheme to predict nonlinear time series successively. By using dimensionality non-separable ridgelet kernels, SRKR can process high-dimensional data more efficiently. An Online Sequential Extreme Learning Algorithm (OS-ELA) is employed to rapidly produce a sequence of estimates. OS-ELA learns the training data one by one or chunk by chunk (with fixed or varying chunk size) and discards each chunk as soon as it has been learned, keeping the memory bounded during online learning. An evolutionary scheme is also incorporated to obtain a good sparse regressor. Experiments are conducted on several nonlinear time-series prediction problems in which the examples arrive one by one. Comparisons show the efficiency and superiority of the proposed method over its counterparts.

This work is supported by the National Science Foundation of China under Grant Nos. 61072108, 60971112, and 60601029, and by the Basic Science Research Fund of Xidian University under Grant No. JY10000902041.
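The following is a minimal sketch of the general scheme the abstract describes: an OS-ELM-style recursive least-squares update over ridgelet-type hidden units, learning the data chunk by chunk and discarding each chunk once it has been absorbed. It is not the authors' SRKR/OS-ELA implementation; the class name OSELMRidgelet, the ridge function psi(t) = t·exp(-t^2/2), the ranges of the random scales and shifts, and the regularization term are illustrative assumptions, and the evolutionary sparsification step is omitted.

```python
import numpy as np

class OSELMRidgelet:
    """Sketch of an OS-ELM-style sequential regressor with ridgelet-type
    hidden units. Hidden-unit form, scale/shift ranges and the ridge
    function psi are assumptions, not the paper's exact SRKR/OS-ELA."""

    def __init__(self, n_hidden, n_inputs, rng=None):
        rng = np.random.default_rng(rng)
        # Random ridge directions u (unit vectors), scales a > 0 and shifts b.
        u = rng.standard_normal((n_hidden, n_inputs))
        self.u = u / np.linalg.norm(u, axis=1, keepdims=True)
        self.a = rng.uniform(0.5, 2.0, n_hidden)
        self.b = rng.uniform(-1.0, 1.0, n_hidden)
        self.beta = None   # output weights
        self.P = None      # inverse-covariance matrix used by the update

    def _hidden(self, X):
        # Ridgelet-type activation psi(t) = t * exp(-t^2 / 2) applied to the
        # projected, shifted and scaled inputs (an assumed admissible psi).
        t = (X @ self.u.T - self.b) / self.a
        return t * np.exp(-0.5 * t ** 2) / np.sqrt(self.a)

    def fit_initial(self, X0, y0, reg=1e-6):
        # Batch initialization on the first chunk: beta = (H'H)^(-1) H'y.
        # The small ridge term 'reg' is added only for numerical stability.
        H = self._hidden(X0)
        self.P = np.linalg.inv(H.T @ H + reg * np.eye(H.shape[1]))
        self.beta = self.P @ H.T @ y0

    def partial_fit(self, X, y):
        # Recursive least-squares update for a new chunk; the chunk is then
        # discarded, so memory stays bounded.
        H = self._hidden(X)
        S = np.eye(H.shape[0]) + H @ self.P @ H.T
        self.P = self.P - self.P @ H.T @ np.linalg.solve(S, H @ self.P)
        self.beta = self.beta + self.P @ H.T @ (y - H @ self.beta)

    def predict(self, X):
        return self._hidden(X) @ self.beta
```

In use, fit_initial is called on the first chunk and partial_fit on each subsequent chunk as it arrives; only the small matrix P and the output weights beta are retained between chunks, which is what keeps the memory bounded in the online setting.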





Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Yang, S., Zuo, D., Wang, M., Jiao, L. (2012). Online Sequential Extreme Learning of Sparse Ridgelet Kernel Regressor for Nonlinear Time-Series Prediction. In: Zhang, Y., Zhou, ZH., Zhang, C., Li, Y. (eds) Intelligent Science and Intelligent Data Engineering. IScIDE 2011. Lecture Notes in Computer Science, vol 7202. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-31919-8_3


  • DOI: https://doi.org/10.1007/978-3-642-31919-8_3

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-31918-1

  • Online ISBN: 978-3-642-31919-8

  • eBook Packages: Computer Science, Computer Science (R0)
