Abstract
In previous chapters, we showed that the computational power of our neural networks depends on the type of numbers utilized as weights. Neural networks with rational weights, just like Turing machines, are finite objects, in the sense that they can be described with a finite amount of information. This is not true for networks with real weights; these have access to a potentially infinite source of information, which may allow them to compute nonrecursive functions. This chapter proves the intuitive notion that as the real numbers used grow richer in information, more functions become computable. To formalize this statement, we need a measure by which to quantify the information contained in real numbers.
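The idea that a single real weight can act as an unbounded information source can be illustrated with a toy sketch. The code below is not the chapter's construction; it is a hypothetical illustration (function names `encode_bits` and `read_bit` are invented here) showing how an arbitrary bit sequence can be packed into the binary expansion of one number in [0, 1) and read back bit by bit. A genuine real weight would carry an infinite expansion; a finite prefix, handled exactly with `Fraction`, suffices to convey the intuition.

```python
from fractions import Fraction

def encode_bits(bits):
    """Pack a finite prefix of an advice sequence into a rational in [0, 1).

    The weight is w = sum_i bits[i] / 2**(i+1), i.e. the bits become the
    binary expansion of w. Fraction is used so the demonstration is exact,
    free of floating-point rounding.
    """
    w = Fraction(0)
    for i, b in enumerate(bits, start=1):
        w += Fraction(b, 2**i)
    return w

def read_bit(w, n):
    """Recover the n-th bit (1-indexed) of w's binary expansion.

    Shifting by 2**n moves bit n to the units place; floor then mod 2
    isolates it.
    """
    return int(w * 2**n) % 2

# Stand-in for a (possibly nonrecursive) advice sequence.
advice = [1, 0, 1, 1, 0, 0, 1]
w = encode_bits(advice)
recovered = [read_bit(w, n) for n in range(1, len(advice) + 1)]
assert recovered == advice
```

If the sequence being encoded is nonrecursive, no finite program can generate the weight, which is exactly why a measure of the information content of real numbers, such as the Kolmogorov-style measure this chapter develops, is needed to stratify what such networks can compute.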
Copyright information
© 1999 Springer Science+Business Media New York
Cite this chapter
Siegelmann, H.T. (1999). Kolmogorov Weights: Between P and P/poly. In: Neural Networks and Analog Computation. Progress in Theoretical Computer Science. Birkhäuser, Boston, MA. https://doi.org/10.1007/978-1-4612-0707-8_5
Print ISBN: 978-1-4612-6875-8
Online ISBN: 978-1-4612-0707-8