Abstract
In Numerical Analysis one often has to conclude that an error function is small everywhere if it is small on a large discrete point set and if a bound on one of its derivatives is available. Sampling inequalities put this reasoning on a solid mathematical basis.
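A typical sampling inequality for Sobolev functions, in the notation common to the literature surveyed here (a suitable bounded domain $\Omega\subset\mathbb{R}^d$, a finite point set $X\subset\Omega$ with fill distance $h$, and smoothness $k>d/p$), takes the form:

```latex
\|u\|_{L_\infty(\Omega)}
  \le C \left( h^{\,k-d/p}\, \|u\|_{W_p^k(\Omega)}
             + \|u\|_{\ell_\infty(X)} \right)
  \qquad \text{for all } u \in W_p^k(\Omega).
```

Applied to an error function $u$ that is small on $X$, the first term shows how a derivative bound $\|u\|_{W_p^k(\Omega)}$ forces $u$ to be small everywhere once the fill distance $h$ is small.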
A stability inequality is similar, but it holds only on a finite-dimensional space of trial functions. It bounds a trial function by a norm of its values on a sufficiently fine data sample, without requiring any bound on a high derivative.
This survey first describes these two types of inequalities in general and shows how to derive a stability inequality from a sampling inequality plus an inverse inequality on a finite-dimensional trial space. Then the state of the art in sampling inequalities is reviewed, and new extensions involving functions of infinite smoothness and sampling operators using weak data are presented.
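The derivation of a stability inequality from these two ingredients can be sketched as follows. Assume a sampling inequality of the typical form $\|v\|_{L_\infty(\Omega)} \le C\,(h^{k-d/p}\|v\|_{W_p^k(\Omega)} + \|v\|_{\ell_\infty(X)})$ together with an inverse inequality $\|v\|_{W_p^k(\Omega)} \le \gamma(V)\,\|v\|_{L_\infty(\Omega)}$ on a finite-dimensional trial space $V$. Combining the two gives

```latex
\|v\|_{L_\infty(\Omega)}
  \le C\, h^{\,k-d/p}\, \gamma(V)\, \|v\|_{L_\infty(\Omega)}
    + C\, \|v\|_{\ell_\infty(X)}
  \qquad \text{for all } v \in V,
```

so that for a data sample fine enough to guarantee $C\,h^{k-d/p}\gamma(V)\le 1/2$, the first term can be absorbed into the left-hand side, yielding the stability inequality $\|v\|_{L_\infty(\Omega)} \le 2C\,\|v\|_{\ell_\infty(X)}$ on $V$, with no derivative bound required on $v$.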
Finally, typical applications of sampling and stability inequalities for recovery of functions from scattered weak or strong data are surveyed. These include Support Vector Machines and unsymmetric methods for solving partial differential equations.
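As an illustration of the recovery problem underlying these applications, the following is a minimal kernel-based interpolation sketch (our own illustrative code; the Gaussian kernel, the scale parameter, and the test function are arbitrary choices, not taken from the paper):

```python
import numpy as np

# Recover a function from scattered (here: equispaced, for simplicity)
# strong data via kernel interpolation, and check that the error is
# small away from the data sites -- the situation that sampling
# inequalities quantify.

def gaussian_kernel(x, y, scale):
    # K(x, y) = exp(-|x - y|^2 / scale^2), evaluated on all pairs.
    return np.exp(-((x[:, None] - y[None, :]) ** 2) / scale**2)

def kernel_interpolant(x_data, f_data, scale):
    # Solve K c = f for the coefficients of s(x) = sum_j c_j K(x, x_j).
    K = gaussian_kernel(x_data, x_data, scale)
    coeffs = np.linalg.solve(K, f_data)
    return lambda x: gaussian_kernel(x, x_data, scale) @ coeffs

# Data sample on [0, 1] and the interpolant built from it.
x_data = np.linspace(0.0, 1.0, 20)
f_data = np.sin(2 * np.pi * x_data)
s = kernel_interpolant(x_data, f_data, scale=0.12)

# Maximum error on a much finer evaluation grid.
x_test = np.linspace(0.0, 1.0, 200)
err = np.max(np.abs(s(x_test) - np.sin(2 * np.pi * x_test)))
print(err)  # small maximum error away from the data sites
```

The experiment mirrors the theory: the interpolant is controlled everywhere by its values on the data sample, and the error away from the sample shrinks as the fill distance of the data set decreases.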
© 2010 Springer-Verlag Berlin Heidelberg
Cite this paper
Rieger, C., Schaback, R., Zwicknagl, B. (2010). Sampling and Stability. In: Dæhlen, M., Floater, M., Lyche, T., Merrien, JL., Mørken, K., Schumaker, L.L. (eds) Mathematical Methods for Curves and Surfaces. MMCS 2008. Lecture Notes in Computer Science, vol 5862. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-11620-9_23
Print ISBN: 978-3-642-11619-3
Online ISBN: 978-3-642-11620-9