
Kernel Networks with Fixed and Variable Widths

  • Conference paper
Adaptive and Natural Computing Algorithms (ICANNGA 2011)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 6593)

Abstract

The role of width in kernel models and radial-basis-function networks is investigated, with special emphasis on the Gaussian case. Quantitative bounds on kernel-based regularization are given that show the effect of changing the width. These bounds are d-th powers of width ratios, where d is the input dimension, and so they grow exponentially with the dimension of the input data.
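
To make the role of the width parameter concrete, the following is a minimal, self-contained Python sketch (not taken from the paper) of Gaussian kernel ridge regression in which the width can be varied. The kernel normalization exp(-||x - y||^2 / width^2), the regularization parameter gamma, and the toy one-dimensional data are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def gaussian_kernel(X, Y, width):
    """Gaussian kernel K(x, y) = exp(-||x - y||^2 / width^2) between row sets X and Y."""
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dists / width ** 2)

def kernel_ridge_fit(X, y, width, gamma):
    """Solve the regularized system (K + gamma * m * I) c = y for the coefficients c."""
    m = len(y)
    K = gaussian_kernel(X, X, width)
    return np.linalg.solve(K + gamma * m * np.eye(m), y)

def kernel_ridge_predict(X_train, c, X_test, width):
    """Evaluate the kernel expansion sum_i c_i K(x, x_i) at the test points."""
    return gaussian_kernel(X_test, X_train, width) @ c

# Illustrative data: noisy samples of a smooth function on [0, 1] (input dimension d = 1).
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(40, 1))
y = np.sin(2 * np.pi * X[:, 0]) + 0.1 * rng.standard_normal(40)

X_grid = np.linspace(0.0, 1.0, 200)[:, None]
for width in (0.05, 0.2, 1.0):  # same data and same gamma, three kernel widths
    c = kernel_ridge_fit(X, y, width, gamma=1e-3)
    y_hat = kernel_ridge_predict(X, c, X_grid, width)
    print(f"width={width:4.2f}  ||c||={np.linalg.norm(c):8.2f}  max|f_hat|={np.abs(y_hat).max():6.2f}")
```

Fitting the same data with the same regularization parameter at several widths yields markedly different coefficient norms and interpolants, a practical illustration of the kind of width dependence that the bounds in the paper quantify.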

Copyright information

© 2011 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Kůrková, V., Kainen, P.C. (2011). Kernel Networks with Fixed and Variable Widths. In: Dobnikar, A., Lotrič, U., Šter, B. (eds) Adaptive and Natural Computing Algorithms. ICANNGA 2011. Lecture Notes in Computer Science, vol 6593. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-20282-7_2

  • DOI: https://doi.org/10.1007/978-3-642-20282-7_2

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-20281-0

  • Online ISBN: 978-3-642-20282-7
