
A Data-Reusing Gradient Descent Algorithm for Complex-Valued Recurrent Neural Networks

  • Conference paper
Knowledge-Based Intelligent Information and Engineering Systems (KES 2003)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 2774)


Abstract

A class of data-reusing (DR) learning algorithms for real-valued recurrent neural networks (RNNs) employed as nonlinear adaptive filters is extended to the complex domain, giving a class of data-reusing learning algorithms for complex-valued recurrent neural networks (CRNNs). For rigour, the derivation of the data-reusing complex real-time recurrent learning (DRCRTRL) algorithm is undertaken for a general complex activation function. The analysis provides both error bounds and convergence conditions for the cases of contractive and expansive complex activation functions. The improved performance of the data-reusing algorithm over the standard one is verified by simulations on the prediction of complex-valued signals.
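The core idea behind data-reusing adaptation can be illustrated with a simpler linear relative: at each time step, the weight update is repeated several times on the same input/desired pair, refining the a priori error towards an a posteriori one before the next sample arrives. The sketch below is illustrative only: it uses a complex LMS filter as a stand-in rather than the paper's DRCRTRL for a recurrent network, and the test signal, step size `mu`, filter order `p`, and reuse count are hypothetical choices, not values from the paper.

```python
import numpy as np

def dr_clms(x, d, p=4, mu=0.1, reuses=1):
    """Data-reusing complex LMS (a linear stand-in for DRCRTRL).

    At each time step the complex LMS update is re-applied `reuses`
    times on the same (input, desired) pair; reuses=1 recovers the
    standard a priori algorithm.
    """
    w = np.zeros(p, dtype=complex)
    errors = []
    for n in range(p, len(x)):
        u = x[n - p:n][::-1]                 # regressor, most recent sample first
        for _ in range(reuses):              # reuse the same data pair
            e = d[n] - w @ u                 # instantaneous prediction error
            w = w + mu * e * np.conj(u)      # complex LMS weight update
        errors.append(abs(e))                # error at the last reuse iteration
    return w, np.array(errors)

# Usage: one-step prediction of a complex exponential (hypothetical test signal)
s = np.exp(1j * 0.1 * np.arange(300))
w1, e1 = dr_clms(s, s, reuses=1)   # standard complex LMS
w3, e3 = dr_clms(s, s, reuses=3)   # data-reusing variant
```

With several reuses per sample, the error contracts more per time step, which mirrors the faster convergence of the data-reusing algorithm over the standard one reported in the abstract.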





Copyright information

© 2003 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Goh, S.L., Mandic, D.P. (2003). A Data-Reusing Gradient Descent Algorithm for Complex-Valued Recurrent Neural Networks. In: Palade, V., Howlett, R.J., Jain, L. (eds) Knowledge-Based Intelligent Information and Engineering Systems. KES 2003. Lecture Notes in Computer Science, vol 2774. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-45226-3_47


  • DOI: https://doi.org/10.1007/978-3-540-45226-3_47

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-40804-8

  • Online ISBN: 978-3-540-45226-3

  • eBook Packages: Springer Book Archive
