
New Approach of Recurrent Neural Network Weight Initialization


Part of the book series: Lecture Notes in Electrical Engineering (LNEE, volume 14)

Abstract

This paper proposes a weight initialization strategy for a discrete-time recurrent neural network model. It is based on analyzing the recurrent network as a nonlinear system and choosing its initial weights so as to place the system on the boundaries between different dynamics, i.e., at its bifurcations. The relationship between the change in dynamics and the evolution of the training error is studied. Two simple applications of this strategy are shown: the identification of a DC induction motor and the detection of a physiological signal, a feature of a visual evoked potential brain signal.
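As a rough illustration of the idea in the abstract (not the authors' exact procedure), the sketch below initializes the recurrent weight matrix of a simple discrete-time network x[k+1] = tanh(W x[k]) so that the Jacobian at the origin has spectral radius 1, i.e., the fixed point sits on a stability boundary where a bifurcation occurs. The model form, function name, and scaling choice are assumptions made for illustration only.

```python
import numpy as np


def init_recurrent_weights_near_bifurcation(n_neurons, target_radius=1.0, rng=None):
    """Rescale a random recurrent weight matrix so the origin of the map
    x[k+1] = tanh(W x[k]) sits near its stability boundary.

    At the origin the Jacobian of the map is simply W (since tanh'(0) = 1),
    so with target_radius = 1.0 the leading eigenvalue lies on the unit
    circle, i.e., at the bifurcation where the fixed point changes stability.
    This is a generic sketch of the idea, not the chapter's exact method.
    """
    rng = np.random.default_rng(rng)
    W = rng.standard_normal((n_neurons, n_neurons)) / np.sqrt(n_neurons)
    spectral_radius = np.max(np.abs(np.linalg.eigvals(W)))
    return W * (target_radius / spectral_radius)


# Example: a small two-neuron network initialized on the stability boundary.
W0 = init_recurrent_weights_near_bifurcation(2, target_radius=1.0, rng=0)
print(np.max(np.abs(np.linalg.eigvals(W0))))  # ~1.0 by construction
```

Training would then start from these weights, so the optimizer begins in a region where small weight changes can move the network between qualitatively different dynamical regimes.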




Copyright information

© 2009 Springer Science+Business Media B.V.

About this chapter

Cite this chapter

Marichal, R., Piñeiro, J.D., González, E.J., Torres, J.M. (2009). New Approach of Recurrent Neural Network Weight Initialization. In: Ao, S.I., Rieger, B., Chen, S.S. (eds) Advances in Computational Algorithms and Data Analysis. Lecture Notes in Electrical Engineering, vol 14. Springer, Dordrecht. https://doi.org/10.1007/978-1-4020-8919-0_37


  • DOI: https://doi.org/10.1007/978-1-4020-8919-0_37

  • Publisher Name: Springer, Dordrecht

  • Print ISBN: 978-1-4020-8918-3

  • Online ISBN: 978-1-4020-8919-0

  • eBook Packages: Computer Science, Computer Science (R0)
