FNN (Feedforward Neural Network) Training Method Based on Robust Recursive Least Square Method

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 4492)

Abstract

We present a robust recursive least squares (RLS) algorithm for training multilayer feed-forward neural networks. RLS has been applied successfully to multilayer feed-forward network training, but it tends to diverge because of instability in the recursive inversion procedure. In this paper we propose a numerically robust RLS-type algorithm based on prewhitening. The proposed algorithm improves on standard RLS in both infinite and finite numerical precision. Computer simulations at several numerical precisions show that the proposed algorithm improves the numerical robustness of RLS training.
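
As context for the abstract, the sketch below shows one step of the standard RLS recursion for a single linear unit, the baseline whose recursive matrix-inversion update is the numerically fragile part. It is a minimal illustration, not the authors' prewhitened algorithm; the function name rls_update, the forgetting factor lam, and the toy identification example are assumptions introduced here for clarity. According to the abstract, the proposed variant instead applies prewhitening to keep this recursion numerically well behaved, but its details are not reproduced here.

```python
import numpy as np

def rls_update(w, P, x, d, lam=0.99):
    """One standard RLS step for a linear model y = w @ x.

    The recursive update of the inverse-correlation matrix P is the
    numerically fragile part: round-off can destroy its symmetry and
    positive definiteness, which is the divergence mechanism the
    abstract attributes to the recursive inversion procedure.
    """
    Px = P @ x
    k = Px / (lam + x @ Px)          # gain vector
    e = d - w @ x                    # a priori error
    w = w + k * e                    # weight update
    P = (P - np.outer(k, Px)) / lam  # inverse-correlation update
    return w, P, e

# Toy usage (hypothetical): identify a 3-tap linear mapping from noisy samples.
rng = np.random.default_rng(0)
w_true = np.array([0.5, -1.2, 2.0])
w, P = np.zeros(3), 100.0 * np.eye(3)   # large initial P = weak prior
for _ in range(500):
    x = rng.standard_normal(3)
    d = w_true @ x + 0.01 * rng.standard_normal()
    w, P, _ = rls_update(w, P, x, d)
print(np.round(w, 3))                   # converges towards w_true
```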

Editor information

Derong Liu, Shumin Fei, Zengguang Hou, Huaguang Zhang, Changyin Sun

Copyright information

© 2007 Springer Berlin Heidelberg

About this paper

Cite this paper

Lim, J., Sung, K. (2007). FNN (Feedforward Neural Network) Training Method Based on Robust Recursive Least Square Method. In: Liu, D., Fei, S., Hou, Z., Zhang, H., Sun, C. (eds) Advances in Neural Networks – ISNN 2007. ISNN 2007. Lecture Notes in Computer Science, vol 4492. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-72393-6_48

  • DOI: https://doi.org/10.1007/978-3-540-72393-6_48

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-72392-9

  • Online ISBN: 978-3-540-72393-6

  • eBook Packages: Computer Science (R0)
