Abstract
In this paper a recurrent network consisting of O(√m log m) RBF (radial basis function) units with the maximum norm is constructed to implement a given deterministic finite automaton with m states, employing any activation function that takes different values at at least two nonnegative points. The underlying simulation proves to be robust with respect to analog noise for a large class of smooth activation functions with a special type of inflection.
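To make the setting concrete, here is a minimal sketch of a recurrent network of max-norm RBF units simulating a DFA. This is the naive one-unit-per-transition encoding, not the paper's compressed O(√m log m) construction, and the step activation is just one simple function taking different values at two nonnegative points; all names and the two-state parity automaton are illustrative assumptions.

```python
import numpy as np

def rbf_unit(x, center, threshold=0.5):
    # Max-norm RBF unit with a step activation: outputs 1 iff the
    # input lies within max-norm distance `threshold` of its center.
    return 1.0 if np.max(np.abs(x - center)) < threshold else 0.0

# Parity DFA over {0, 1}: the state flips on reading symbol 1.
# delta[(state, symbol)] = next_state
delta = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}

def step(state, symbol):
    # One recurrent step: each transition (q, a) owns an RBF unit
    # centered at the point (q, a); the single active unit
    # contributes its target state delta[(q, a)] to the output.
    x = np.array([state, symbol], dtype=float)
    nxt = 0.0
    for (q, a), r in delta.items():
        nxt += r * rbf_unit(x, np.array([q, a], dtype=float))
    return int(nxt)

def run(word, start=0):
    # Feed the input word symbol by symbol through the network.
    s = start
    for a in word:
        s = step(s, a)
    return s
```

Because each unit responds only inside a max-norm ball around its transition's center, small analog perturbations of the state vector leave the active unit unchanged, which is the intuition behind the robustness result proved in the paper.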
Research supported by GA AS CR Grant B2030007.
Copyright information
© 2000 Springer-Verlag Berlin Heidelberg
Cite this paper
Šorel, M., Šíma, J. (2000). Robust Implementation of Finite Automata by Recurrent RBF Networks. In: Hlaváč, V., Jeffery, K.G., Wiedermann, J. (eds) SOFSEM 2000: Theory and Practice of Informatics. SOFSEM 2000. Lecture Notes in Computer Science, vol 1963. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-44411-4_32
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-41348-6
Online ISBN: 978-3-540-44411-4