Abstract
In this chapter, we will explore and discuss the basic architecture of sequence models (Recurrent Neural Networks). In particular, we will:
- Build and train sequence models, including a commonly used variant known as Long Short-Term Memory models (LSTMs).
- Apply sequence models to Natural Language Processing (NLP) problems, including text synthesis.
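As a taste of the architecture this chapter develops, the core of a vanilla RNN is a single recurrence: the hidden state at each time step is a nonlinear function of the current input and the previous hidden state. The sketch below illustrates that recurrence with NumPy; all names, sizes, and initializations are illustrative assumptions, not taken from the chapter.

```python
import numpy as np

# Illustrative sketch of the vanilla RNN recurrence:
#   h_t = tanh(W_xh @ x_t + W_hh @ h_{t-1} + b_h)
# Sizes and weight names here are assumptions for demonstration only.
rng = np.random.default_rng(0)
input_size, hidden_size = 3, 4

W_xh = rng.normal(scale=0.1, size=(hidden_size, input_size))   # input-to-hidden weights
W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))  # hidden-to-hidden weights
b_h = np.zeros(hidden_size)                                    # hidden bias

def rnn_step(x_t, h_prev):
    """One time step: combine the current input with the previous hidden state."""
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)

# Unroll the cell over a toy sequence of 5 time steps,
# starting from a zero hidden state.
h = np.zeros(hidden_size)
for t in range(5):
    x_t = rng.normal(size=input_size)
    h = rnn_step(x_t, h)

print(h.shape)
```

Because the same weights are reused at every step, the network can process sequences of arbitrary length; LSTMs replace this single `tanh` update with a gated cell that is easier to train on long sequences.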
This is the last chapter of the book, and the reader is expected to have a good understanding of neural networks, including convolutional networks.
Machine intelligence is the last invention that humanity will ever need to make.
Nick Bostrom
Copyright information
© 2019 Springer Nature Singapore Pte Ltd.
Cite this chapter
Ghatak, A. (2019). Recurrent Neural Networks (RNN) or Sequence Models. In: Deep Learning with R. Springer, Singapore. https://doi.org/10.1007/978-981-13-5850-0_8
Publisher Name: Springer, Singapore
Print ISBN: 978-981-13-5849-4
Online ISBN: 978-981-13-5850-0
eBook Packages: Computer Science, Computer Science (R0)