Recurrent Neural Networks (RNN) or Sequence Models

Chapter in Deep Learning with R

Abstract

In this chapter, we will explore and discuss the basic architecture of sequence models (Recurrent Neural Networks). In particular, we will

  • Build and train sequence models, including a commonly used variant known as Long Short-Term Memory (LSTM) networks.

  • Apply sequence models to Natural Language Processing (NLP) problems, including text synthesis.

This is the last chapter of the book, and the reader is expected to have a very good understanding of neural networks, including convolutional networks.
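As a rough illustration of what "build and train a sequence model" looks like in R, the sketch below defines a small LSTM classifier with the keras package. This is an assumption for illustration only, not code from the chapter; the vocabulary size, layer widths, and binary output are hypothetical choices.

```r
# Minimal LSTM sketch using the keras package in R (illustrative, not from the book).
library(keras)

model <- keras_model_sequential() %>%
  layer_embedding(input_dim = 10000, output_dim = 32) %>%  # map token ids to 32-d vectors
  layer_lstm(units = 32) %>%                               # one recurrent LSTM layer
  layer_dense(units = 1, activation = "sigmoid")           # binary output, e.g. sentiment

model %>% compile(
  optimizer = "rmsprop",
  loss      = "binary_crossentropy",
  metrics   = c("accuracy")
)

# Training would then call fit() on padded integer sequences:
# model %>% fit(x_train, y_train, epochs = 10, batch_size = 128)
```

The same skeleton extends to the chapter's NLP applications: for text synthesis, the final layer would instead be a softmax over the vocabulary, sampled one token at a time.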

Machine intelligence is the last invention that humanity will ever need to make.

Nick Bostrom


Author information

Corresponding author

Correspondence to Abhijit Ghatak.

Copyright information

© 2019 Springer Nature Singapore Pte Ltd.

About this chapter

Cite this chapter

Ghatak, A. (2019). Recurrent Neural Networks (RNN) or Sequence Models. In: Deep Learning with R. Springer, Singapore. https://doi.org/10.1007/978-981-13-5850-0_8

  • DOI: https://doi.org/10.1007/978-981-13-5850-0_8

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-13-5849-4

  • Online ISBN: 978-981-13-5850-0

  • eBook Packages: Computer Science (R0)
