Modeling Complex Symbolic Sequences with Neural Based Systems
We study the problem of modeling long, complex symbolic sequences with recurrent neural networks (RNNs) and stochastic machines (SMs). RNNs are trained to predict the next symbol, and the training process is monitored with information-theoretic performance measures. SMs are constructed by quantizing the RNN state space with a Kohonen self-organizing map. We compare the generative models through entropy spectra computed either from generated sequences or directly from the machines.
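The SM construction described above can be sketched in a few steps: quantize a sequence of state vectors with a small one-dimensional Kohonen SOM, read off empirical transition probabilities between the quantized states, and compute the machine's entropy rate. This is a simplified illustration under stated assumptions — synthetic vectors stand in for RNN activations, and the SOM and entropy computations are minimal textbook versions, not the authors' implementation.

```python
import math
import random
from collections import defaultdict

def train_som_1d(vectors, n_units, epochs=30, lr0=0.5, seed=0):
    """Minimal 1-D Kohonen SOM: returns n_units prototype vectors
    that quantize the input space (a stand-in for RNN state space)."""
    rng = random.Random(seed)
    dim = len(vectors[0])
    codebook = [[rng.random() for _ in range(dim)] for _ in range(n_units)]
    for epoch in range(epochs):
        lr = lr0 * (1.0 - epoch / epochs)                      # decaying learning rate
        radius = max(0.5, (n_units / 2.0) * (1.0 - epoch / epochs))  # shrinking neighborhood
        for v in vectors:
            # best-matching unit (nearest prototype)
            bmu = min(range(n_units),
                      key=lambda i: sum((codebook[i][d] - v[d]) ** 2 for d in range(dim)))
            for i in range(n_units):
                h = math.exp(-((i - bmu) ** 2) / (2.0 * radius ** 2))
                for d in range(dim):
                    codebook[i][d] += lr * h * (v[d] - codebook[i][d])
    return codebook

def quantize(vectors, codebook):
    """Map each vector to the index of its nearest prototype (its SM state)."""
    dim = len(codebook[0])
    return [min(range(len(codebook)),
                key=lambda i: sum((codebook[i][d] - v[d]) ** 2 for d in range(dim)))
            for v in vectors]

def transition_probs(state_seq):
    """Empirical transition probabilities of the quantized state sequence."""
    counts = defaultdict(lambda: defaultdict(int))
    for a, b in zip(state_seq, state_seq[1:]):
        counts[a][b] += 1
    probs = {}
    for a, row in counts.items():
        total = sum(row.values())
        probs[a] = {b: c / total for b, c in row.items()}
    return probs

def entropy_rate(probs, state_seq):
    """Entropy rate (bits per symbol), weighting each row's Shannon
    entropy by the empirical visit frequency of that state."""
    freq = defaultdict(int)
    for s in state_seq:
        freq[s] += 1
    n = len(state_seq)
    h = 0.0
    for a, row in probs.items():
        row_h = -sum(p * math.log2(p) for p in row.values())
        h += (freq[a] / n) * row_h
    return h
```

For example, a strictly alternating sequence of two state vectors yields a deterministic machine with entropy rate zero, while an i.i.d. source over the same two vectors yields an entropy rate near one bit per symbol — the kind of contrast the entropy spectra are designed to expose.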
Keywords: Recurrent Neural Network · Finite State Machine · Symbolic Sequence · State Transition Matrix · Cross Entropy