Abstract
This paper deals with finite-size recurrent neural networks consisting of general (possibly cyclic) interconnections of evolving processors, where each neuron may assume a real activation value in a bounded range. We provide the first rigorous foundations for recurrent networks that are built of probabilistic, unreliable analog devices and exhibit randomness in their updates.
The first model we consider is probabilistic networks, i.e., deterministic networks augmented with probabilistic binary gates of fixed probabilities. The second model incorporates unreliable devices (either the neurons or the connections between them), corresponding to the random-noise philosophy of Shannon. The third model is a nondeterministic version of the second, in which the nondeterminism is defined over the fault probabilities. We characterize the computational power of these models and show that they are polynomially equivalent; in particular, P_f = NP_f.
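As a toy illustration of the first model (a sketch, not taken from the paper), the code below runs a small recurrent net of saturated-linear neurons augmented with one probabilistic binary gate of fixed probability. The weight matrix `W`, bias `b`, and coin probability are arbitrary placeholder values chosen only to make the example run:

```python
import random

def sat(x):
    """Saturated-linear activation: clamps a real value to [0, 1]."""
    return max(0.0, min(1.0, x))

def step(x, W, b, coin_probs, rng):
    """One synchronous update of an n-neuron net.

    Each probabilistic binary gate i independently outputs 1 with its
    fixed probability coin_probs[i]; the coin outputs feed the update
    like extra inputs (their weights are folded into W for brevity).
    """
    coins = [1.0 if rng.random() < p else 0.0 for p in coin_probs]
    z = x + coins  # current activations followed by coin outputs
    n = len(x)
    return [sat(sum(W[i][j] * z[j] for j in range(len(z))) + b[i])
            for i in range(n)]

rng = random.Random(0)
# toy 2-neuron net with one probabilistic gate of fixed probability 0.5
W = [[0.5, 0.0, 1.0],
     [0.0, 0.5, 0.0]]
b = [0.0, 0.1]
x = [0.0, 0.0]
for _ in range(5):
    x = step(x, W, b, coin_probs=[0.5], rng=rng)
```

The second model (unreliable devices) differs only in where the randomness enters: instead of dedicated coin gates, each neuron or connection would fail, i.e., deviate from its deterministic value, with some fixed fault probability at every step.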
References
L. Adleman. Two theorems on random polynomial time. In IEEE Sympos. on Foundations of Computer Science, volume 19, pages 75–83, New York, 1978.
J.L. Balcázar, J. Díaz, and J. Gabarró. Structural Complexity, volume I and II. Springer-Verlag EATCS Monographs, Berlin, 1988–1990.
C. H. Bennett and J. Gill. Relative to a random oracle A, P^A ≠ NP^A ≠ co-NP^A with probability 1. SIAM J. on Computing, 10:96–113, 1981.
J. L. Balcázar, R. Gavaldà, H.T. Siegelmann, and E. D. Sontag. Some structural complexity aspects of neural computation. In IEEE Structure in Complexity Theory Conference, pages 253–265, San Diego, CA, May 1993.
J. L. Balcázar, M. Hermo, and E. Mayordomo. Characterizations of logarithmic advice complexity classes. Information Processing 92, IFIP Transactions A-12, 1:315–321, 1992.
L. Blum, M. Shub, and S. Smale. On a theory of computation and complexity over the real numbers: NP-completeness, recursive functions, and universal machines. Bull. A.M.S., 21:1–46, 1989.
R.L. Dobrushin and S.I. Ortyukov. Lower bound for the redundancy of self-correcting arrangements of unreliable functional elements. Problems Inform. Transmission, 13:59–65, 1977.
R.L. Dobrushin and S.I. Ortyukov. Upper bound for the redundancy of self-correcting arrangements of unreliable functional elements. Problems Inform. Transmission, 13:346–353, 1977.
J. Kilian and H.T. Siegelmann. On the power of sigmoid neural networks. In Proc. Sixth ACM Workshop on Computational Learning Theory, Santa Cruz, July 1993.
A. A. Muchnik and S. G. Gindikin. The completeness of a system made up of non-reliable elements realizing a function of algebraic logic. Soviet Phys. Dokl., 7:477–479, 1962.
P. Orponen. Neural networks and complexity theory. In Proc. 17th Symposium on Mathematical Foundations of Computer Science, pages 50–61, 1992.
S.I. Ortyukov. Synthesis of asymptotically nonredundant self-correcting arrangements of unreliable functional elements. Problems Inform. Transmission, 13:247–251, 1978.
I. Parberry. Knowledge, understanding, and computational complexity. Technical Report CRPDC-92-2, Department of Computer Sciences, University of North Texas, Feb 1992.
A. Paz. Introduction to Probabilistic Automata. Academic Press, New York, 1971.
N. Pippenger. Reliable computation by formulae in the presence of noise. IEEE Trans. Inform. Theory, 34:194–197, 1988.
N. Pippenger. Invariance of complexity measures for networks with unreliable gates. J. ACM, 36:531–539, 1989.
N. Pippenger. Developments in: The synthesis of reliable organisms from unreliable components. In Proc. of symposia in pure mathematics, volume 5, pages 311–324, 1990.
C. E. Shannon. A mathematical theory of communication. Bell System Tech. J., 27:379–423, 623–656, 1948.
H. T. Siegelmann. On the computational power of faulty and asynchronous neural networks. Technical report, Bar-Ilan university, 1993.
H. T. Siegelmann and E. D. Sontag. Turing computability with neural nets. Appl. Math. Lett., 4(6):77–80, 1991.
H. T. Siegelmann and E. D. Sontag. On computational power of neural networks. J. Comp. Syst. Sci., to appear. Previous version appeared in Proc. Fifth ACM Workshop on Computational Learning Theory, pages 440–449, Pittsburgh, July 1992.
H. T. Siegelmann and E. D. Sontag. Analog computation via neural networks. Theoretical Computer Science, to appear. A preliminary version in: The second Israel Symposium on Theory of Computing and Systems, Natanya, Israel, June, 1993.
D. Uhlig. On the synthesis of self-correcting schemes from functional elements with a small number of reliable elements. Math. Notes Acad. Sci. USSR, 15:558–562, 1974.
J. von Neumann. Probabilistic logics and the synthesis of reliable organisms from unreliable components. In C.E. Shannon and J. McCarthy, editors, Automata Studies. Princeton U. Press, Princeton, NJ, 1956.
N. Wiener. Extrapolation, interpolation, and smoothing of stationary time series. MIT Press, Cambridge, MA, 1949.
Copyright information
© 1994 Springer-Verlag Berlin Heidelberg
Cite this paper
Siegelmann, H.T. (1994). On the computational power of probabilistic and faulty neural networks. In: Abiteboul, S., Shamir, E. (eds) Automata, Languages and Programming. ICALP 1994. Lecture Notes in Computer Science, vol 820. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-58201-0_55
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-58201-4
Online ISBN: 978-3-540-48566-7