Abstract
We propose a general method for estimating the distance between a compact subspace K of the space L^1([0,1]^s) of Lebesgue measurable functions on the hypercube [0,1]^s and the class of functions computed by artificial neural networks with a single hidden layer, each unit evaluating a sigmoidal activation function. Our lower bounds are stated in terms of an invariant that measures the oscillations of functions of the space K around the origin. As an application, we estimate the minimal number of neurons required to approximate bounded functions satisfying uniform Lipschitz conditions of order α to within accuracy ε.
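The trade-off the abstract describes — accuracy ε versus number of hidden units — can be observed numerically. The sketch below is illustrative only: the unit placement, the slope choice, and the least-squares fit of the output weights are our assumptions, not the paper's construction. It approximates the Lipschitz function |x − 1/2| on [0,1] by one-hidden-layer sigmoidal networks of increasing width and records the L^1 error.

```python
import numpy as np

def sigmoid(z):
    # Clip the argument to avoid overflow warnings for steep units.
    return 1.0 / (1.0 + np.exp(-np.clip(z, -60.0, 60.0)))

def fit_sigmoidal_net(f, n_units, grid):
    """Fit output weights of a one-hidden-layer sigmoidal net by least squares.

    Hidden units are sigmoids with centers on a uniform grid and a fixed
    steep slope -- an illustrative choice, not the paper's construction.
    """
    centers = np.linspace(0.0, 1.0, n_units)
    slope = 4.0 * n_units  # steeper sigmoids as the grid refines
    design = np.hstack([sigmoid(slope * (grid[:, None] - centers[None, :])),
                        np.ones((grid.size, 1))])  # bias column
    w, *_ = np.linalg.lstsq(design, f(grid), rcond=None)
    return lambda x: (sigmoid(slope * (np.asarray(x)[:, None]
                                       - centers[None, :])) @ w[:-1] + w[-1])

def l1_error(f, g, grid):
    # The grid is uniform on [0,1], so the mean approximates the L^1 norm.
    return float(np.mean(np.abs(f(grid) - g(grid))))

f = lambda x: np.abs(x - 0.5)  # Lipschitz of order alpha = 1
grid = np.linspace(0.0, 1.0, 2001)
errs = {n: l1_error(f, fit_sigmoidal_net(f, n, grid), grid) for n in (2, 8, 32)}
```

Plotting `errs` against the number of units shows the L^1 error shrinking as width grows; the paper's lower bounds address the converse question of how many units such a decay necessarily requires.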
© 2009 Springer-Verlag Berlin Heidelberg
Cite this paper
Montaña, J.L., Borges, C.E. (2009). Lower Bounds for Approximation of Some Classes of Lebesgue Measurable Functions by Sigmoidal Neural Networks. In: Cabestany, J., Sandoval, F., Prieto, A., Corchado, J.M. (eds) Bio-Inspired Systems: Computational and Ambient Intelligence. IWANN 2009. Lecture Notes in Computer Science, vol 5517. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-02478-8_1
DOI: https://doi.org/10.1007/978-3-642-02478-8_1
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-02477-1
Online ISBN: 978-3-642-02478-8
eBook Packages: Computer Science (R0)