An extended Elman net for modeling time series

  • Part III: Learning: Theory and Algorithms
  • Conference paper

Artificial Neural Networks — ICANN'97 (ICANN 1997)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 1327)

Abstract

The prediction and modeling of dynamical systems, for example chaotic time series, with neural networks remains an interesting and challenging research problem. It is natural to employ recurrent neural networks for this task, and we suggest a new structure based on the Elman net [1]. The main difference from the networks proposed by Williams and Zipser [2] is the way the time steps are organized: the dynamics of the network and of the input flow are defined so as to guarantee that the information at the input node is available at the output node within one time step, irrespective of the connection matrix. We apply the network to the Lorenz and the Rössler systems and comment on the problem of evaluating the quality of a network used as a dynamical model.
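
For concreteness, the sketch below implements a plain Elman net with context units and trains it for one-step prediction on the x-coordinate of the Lorenz system. It is a minimal illustration only: the network sizes, the Euler integration of the Lorenz equations, and the per-step gradient scheme (context treated as fixed input, as in Elman's original training) are assumptions of this sketch, not the extended structure or training procedure of the paper.

```python
import numpy as np

def lorenz_series(n, dt=0.01):
    """Euler-integrate the Lorenz system and return the x-coordinate."""
    x, y, z = 1.0, 1.0, 1.0
    xs = []
    for _ in range(n):
        dx = 10.0 * (y - x)
        dy = x * (28.0 - z) - y
        dz = x * y - (8.0 / 3.0) * z
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        xs.append(x)
    return np.array(xs)

class ElmanNet:
    """Elman net: the hidden state is copied to context units each step,
    so the input reaches the output node within a single time step."""
    def __init__(self, n_hidden=10, lr=0.01, seed=0):
        rng = np.random.default_rng(seed)
        self.W_in = rng.normal(0.0, 0.1, n_hidden)                # input -> hidden
        self.W_ctx = rng.normal(0.0, 0.1, (n_hidden, n_hidden))   # context -> hidden
        self.W_out = rng.normal(0.0, 0.1, n_hidden)               # hidden -> output
        self.lr = lr
        self.h = np.zeros(n_hidden)

    def step(self, u):
        ctx = self.h.copy()                 # context = previous hidden state
        self.h = np.tanh(self.W_in * u + self.W_ctx @ ctx)
        return self.W_out @ self.h, ctx     # linear output node

    def train_step(self, u, target):
        y, ctx = self.step(u)
        err = y - target
        dh = err * self.W_out * (1.0 - self.h ** 2)  # backprop one step only
        self.W_out -= self.lr * err * self.h
        self.W_in -= self.lr * dh * u
        self.W_ctx -= self.lr * np.outer(dh, ctx)
        return err ** 2

series = lorenz_series(5000)
series = (series - series.mean()) / series.std()  # normalize the series
net = ElmanNet()
for epoch in range(5):
    mse = np.mean([net.train_step(series[t], series[t + 1])
                   for t in range(len(series) - 1)])
    print(f"epoch {epoch}: one-step MSE = {mse:.5f}")
```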

References

  1. J. Elman. Finding structure in time. Cognitive Science, 14:179–211, 1990.

  2. R. J. Williams and D. Zipser. A learning algorithm for continually running fully recurrent neural networks. Neural Computation, 1:270–280, 1989.

  3. K. Hornik, M. Stinchcombe, and H. White. Multilayer feedforward networks are universal approximators. Neural Networks, 2:359–366, 1989.

  4. G. Deco and B. Schürmann. Neural learning of chaotic system behavior. IEICE Trans. Fundamentals, E77-A:1840–1845, 1994.

  5. J. C. Principe and J.-M. Kuo. Dynamic modelling of chaotic time series with neural networks. In R. P. Lippmann, J. E. Moody, and D. S. Touretzky, editors, Advances in Neural Information Processing Systems (NIPS) 7. Morgan Kaufmann, 1995.

  6. J. Elman. Learning and development in neural networks: the importance of starting small. Cognition, 48:71–99, 1993.

  7. T. Sauer, J. A. Yorke, and M. Casdagli. Embedology. J. Stat. Phys., 65:579–616, 1991.

  8. H. Abarbanel, R. Brown, J. Sidorowich, and L. Tsimring. Analysis of observed chaotic data in physical systems. Rev. Mod. Phys., 65:1331–1392, 1993.

  9. E. N. Lorenz. Deterministic nonperiodic flow. J. Atmospheric Sci., 20:130–141, 1963.

  10. A. Wolf, J. B. Swift, H. L. Swinney, and J. A. Vastano. Determining Lyapunov exponents from a time series. Physica D, 16:285–317, 1985.

  11. O. E. Rössler. An equation for continuous chaos. Phys. Lett. A, 57:397–398, 1976.

Author information

Authors: P. Stagge, B. Sendhoff

Editor information

Editors: Wulfram Gerstner, Alain Germond, Martin Hasler, Jean-Daniel Nicoud

Copyright information

© 1997 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Stagge, P., Sendhoff, B. (1997). An extended Elman net for modeling time series. In: Gerstner, W., Germond, A., Hasler, M., Nicoud, JD. (eds) Artificial Neural Networks — ICANN'97. ICANN 1997. Lecture Notes in Computer Science, vol 1327. Springer, Berlin, Heidelberg. https://doi.org/10.1007/BFb0020192

  • DOI: https://doi.org/10.1007/BFb0020192

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-63631-1

  • Online ISBN: 978-3-540-69620-9
