Abstract
A class of data-reusing (DR) learning algorithms for real-valued recurrent neural networks (RNNs) employed as nonlinear adaptive filters is extended to the complex domain, yielding a class of data-reusing learning algorithms for complex-valued recurrent neural networks (CRNNs). For rigour, the data-reusing complex real-time recurrent learning (DRCRTRL) algorithm is derived for a general complex activation function. The analysis provides both error bounds and convergence conditions for the cases of contractive and expansive complex activation functions. Simulations on the prediction of complex-valued signals verify the improved performance of the data-reusing algorithms over their standard counterparts.
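To illustrate the data-reusing principle described in the abstract, the sketch below applies a data-reusing complex nonlinear gradient descent update to a single complex neuron used as a one-step predictor: each input/target pair is reused several times per sample, with the error re-evaluated after every weight update. This is a deliberately simplified, hypothetical example, not the paper's full CRNN/DRCRTRL derivation; the function name, the fully complex tanh activation, and all parameter values are illustrative assumptions.

```python
import numpy as np

def data_reusing_cngd(x, d, p=4, L=3, eta=0.05):
    """Data-reusing complex nonlinear gradient descent (illustrative sketch).

    x   : complex-valued input signal
    d   : desired signal (for one-step prediction, d[k] = x[k])
    p   : filter order (number of past samples in the regressor)
    L   : number of reuses of each data pair per sample
    eta : learning rate
    """
    w = np.zeros(p, dtype=complex)              # complex weight vector
    e_hist = []
    for k in range(p, len(x)):
        xk = x[k - p:k][::-1]                   # regressor of past samples
        for _ in range(L):                      # reuse the same (xk, d[k]) pair
            net = np.dot(w, xk)                 # complex net input
            y = np.tanh(net)                    # fully complex activation (assumed)
            e = d[k] - y                        # error after the latest update
            # For a holomorphic activation, tanh'(net) = 1 - tanh(net)**2,
            # and the gradient w.r.t. conj(w) gives the update below.
            w = w + eta * e * np.conj(1.0 - y**2) * np.conj(xk)
        e_hist.append(abs(e))                   # a posteriori error magnitude
    return w, np.array(e_hist)
```

Reusing each pair L times drives the a posteriori error below the a priori error of a single update, which is the source of the performance gain the paper analyses for contractive activations.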
© 2003 Springer-Verlag Berlin Heidelberg
Goh, S.L., Mandic, D.P. (2003). A Data-Reusing Gradient Descent Algorithm for Complex-Valued Recurrent Neural Networks. In: Palade, V., Howlett, R.J., Jain, L. (eds) Knowledge-Based Intelligent Information and Engineering Systems. KES 2003. Lecture Notes in Computer Science(), vol 2774. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-45226-3_47
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-40804-8
Online ISBN: 978-3-540-45226-3