Recurrent neural networks are an effective architecture for sequence learning tasks where the data is strongly correlated along a single axis. This axis typically corresponds to time, or in some cases (such as protein secondary structure prediction) one-dimensional space. Some of the properties that make RNNs suitable for sequence learning, such as robustness to input warping and the ability to learn which context to use, are also desirable in domains with more than one spatio-temporal dimension.
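The idea of extending recurrence to more than one axis can be illustrated with a minimal sketch of a two-dimensional RNN forward pass, in which the hidden state at grid point (i, j) receives recurrent input from both of its predecessors, (i-1, j) and (i, j-1). The function and weight names (`mdrnn_forward_2d`, `W_in`, `W_h1`, `W_h2`) are illustrative, not the paper's notation.

```python
import numpy as np

def mdrnn_forward_2d(x, W_in, W_h1, W_h2, b):
    """Sketch of a 2-D recurrent forward pass.

    x: input grid of shape (H, W, n_in); the hidden state at (i, j)
    combines the input there with the hidden states at (i-1, j) and
    (i, j-1), one recurrent weight matrix per axis (an assumption
    for illustration).
    """
    H, W, _ = x.shape
    n_hidden = b.shape[0]
    h = np.zeros((H, W, n_hidden))
    for i in range(H):
        for j in range(W):
            a = x[i, j] @ W_in + b
            if i > 0:
                a += h[i - 1, j] @ W_h1  # recurrent input along axis 0
            if j > 0:
                a += h[i, j - 1] @ W_h2  # recurrent input along axis 1
            h[i, j] = np.tanh(a)
    return h
```

Scanning the grid in this order guarantees that both predecessor states are already computed when point (i, j) is visited, the two-dimensional analogue of processing a sequence left to right.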
Keywords: Hidden Layer · Recurrent Neural Network · Convolutional Neural Network · Handwritten Digit · Forward Pass