On the Comparisons between RLSA and CLA for Solving Arbitrary Linear Simultaneous Equations
This paper compares the performance of the constrained learning algorithm (CLA) and the recursive least squares algorithm (RLSA) in solving linear simultaneous equations. Experiments show that the CLA converges much faster than the recursive least squares backpropagation (RLS-BP) algorithm. Related experimental results are presented.
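As background for the comparison, the RLSA side of the experiment amounts to fitting the unknown vector of a linear system one equation at a time. The sketch below is a minimal recursive least squares solver for a consistent system Ax = b; it is an illustration of the standard RLS recursion, not the paper's exact implementation, and the function name, forgetting factor `lam`, and initialization constant `delta` are assumptions for the example.

```python
import numpy as np

def rls_solve(A, b, lam=1.0, delta=1e6, epochs=50):
    """Solve Ax = b by recursive least squares (illustrative sketch).

    Each row a_i of A with target b_i is presented as one sample;
    repeated passes (epochs) refine the estimate.
    """
    n = A.shape[1]
    x = np.zeros(n)
    P = delta * np.eye(n)  # large initial inverse-correlation matrix
    for _ in range(epochs):
        for a, y in zip(A, b):
            Pa = P @ a
            k = Pa / (lam + a @ Pa)        # gain vector
            x = x + k * (y - a @ x)        # update with prediction error
            P = (P - np.outer(k, Pa)) / lam
    return x

# Example: 2x + y = 3, x + 3y = 5 (exact solution x = 0.8, y = 1.4)
A = np.array([[2.0, 1.0], [1.0, 3.0]])
b = np.array([3.0, 5.0])
x = rls_solve(A, b)
```

For a consistent square system, repeated passes drive the estimate to the exact solution; the CLA studied in the paper is reported to reach comparable accuracy in fewer iterations.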