Abstract
We investigate rates of approximation of multivariable functions by one-hidden-layer neural networks with a general hidden unit function. Under mild assumptions on the hidden unit function, we derive upper bounds on rates of approximation (measured by both the number of hidden units and the size of parameters) in terms of various norms of the function to be approximated and its higher-order moduli of continuity.
This work was partially supported by GA AV grant A2030602 and KBN grant 8T11A02311.
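The object studied in the abstract, a one-hidden-layer network with a general hidden unit function, has the form f(x) = Σᵢ wᵢ φ(aᵢ·x + bᵢ). The following is a minimal illustrative sketch of evaluating such a network, not code from the paper; the function and variable names, the choice of a sigmoidal φ, and the use of NumPy are all assumptions made for illustration.

```python
import numpy as np

def one_hidden_layer(x, outer_weights, inner_weights, biases, phi):
    """Evaluate f(x) = sum_i w_i * phi(a_i . x + b_i),
    a one-hidden-layer network with hidden unit function phi.
    (Illustrative sketch; names are not from the paper.)"""
    hidden = phi(inner_weights @ x + biases)  # hidden-unit outputs, shape (n_units,)
    return float(outer_weights @ hidden)      # linear output unit

# Example: n = 3 hidden units, input dimension d = 2, sigmoidal phi.
sigmoid = lambda t: 1.0 / (1.0 + np.exp(-t))
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 2))  # inner weights a_i
b = rng.standard_normal(3)       # biases b_i
w = rng.standard_normal(3)       # outer weights w_i
y = one_hidden_layer(np.array([0.5, -1.0]), w, A, b, sigmoid)
```

The approximation rates in the paper bound how well such finite sums can approximate a target function as the number of hidden units n and the size of the parameters (the wᵢ, aᵢ, bᵢ) grow.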
© 1998 Springer-Verlag London Limited
Cite this paper
Kůrková, V. (1998). Rates of Approximation of Multivariable Functions by One-hidden-layer Neural Networks. In: Marinaro, M., Tagliaferri, R. (eds) Neural Nets WIRN VIETRI-97. Perspectives in Neural Computing. Springer, London. https://doi.org/10.1007/978-1-4471-1520-5_9
Print ISBN: 978-1-4471-1522-9
Online ISBN: 978-1-4471-1520-5