Abstract
Estimating a non-uniformly sampled function from a set of learning points is a classical regression problem. Kernel methods are widely used in this context, but every such problem raises two major tasks: choosing the kernel and setting the trade-off between data fitness and regularization.
This article presents a new method for estimating a function from noisy learning points in the framework of Reproducing Kernel Hilbert Spaces (RKHS). We introduce the Kernel Basis Pursuit algorithm, which builds an ℓ1-regularized multiple-kernel estimator. The general idea is to decompose the target function over a sparse, optimal set of spanning functions. Our implementation relies on the Least Absolute Shrinkage and Selection Operator (LASSO) formulation and on the Least Angle Regression (LARS) solver. Computing the full regularization path with LARS enables us to propose new adaptive criteria for finding an optimal trade-off between data fitness and regularization. Our overall aim is a fast, parameter-free method for estimating non-uniformly sampled functions.
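The core idea, a sparse ℓ1-regularized decomposition of the target function over a multiple-kernel dictionary, can be illustrated with a minimal, self-contained sketch. This is not the authors' implementation: the LASSO is solved here by plain iterative soft-thresholding (ISTA) rather than LARS for brevity, and the kernel choice, bandwidths, sample points, and regularization level are all illustrative.

```python
import math

def gaussian(x, c, bw):
    # Gaussian kernel value between point x and center c with bandwidth bw.
    return math.exp(-((x - c) ** 2) / (2.0 * bw ** 2))

def build_dictionary(xs, bandwidths):
    # One column per (center, bandwidth) pair: a multiple-kernel dictionary,
    # so the l1 penalty can select both the centers and the kernel scale.
    return [[gaussian(x, c, bw) for c in xs for bw in bandwidths] for x in xs]

def soft_threshold(v, t):
    # Proximal operator of the l1 norm.
    if abs(v) <= t:
        return 0.0
    return v - t if v > 0 else v + t

def lasso_ista(Phi, y, lam, step=0.01, iters=3000):
    # Minimizes 0.5 * ||y - Phi beta||^2 + lam * ||beta||_1 by iterative
    # soft-thresholding (a simple stand-in for the LARS solver).
    n, p = len(Phi), len(Phi[0])
    beta = [0.0] * p
    for _ in range(iters):
        resid = [y[i] - sum(Phi[i][j] * beta[j] for j in range(p))
                 for i in range(n)]
        beta = [soft_threshold(
                    beta[j] + step * sum(Phi[i][j] * resid[i] for i in range(n)),
                    step * lam)
                for j in range(p)]
    return beta

# Non-uniformly sampled 1-D target (illustrative data).
xs = [0.0, 0.3, 0.9, 1.0, 1.7, 2.2, 3.0, 3.1]
ys = [math.sin(x) for x in xs]
Phi = build_dictionary(xs, bandwidths=[0.5, 2.0])  # two kernel scales
beta = lasso_ista(Phi, ys, lam=0.1)
active = sum(1 for b in beta if abs(b) > 1e-8)
print("active atoms:", active, "of", len(beta))
```

The ℓ1 penalty drives many dictionary coefficients to exactly zero, so the estimator retains only a sparse subset of the kernel atoms; sweeping `lam` from large to small traces the regularization path that, in the paper, LARS computes in full.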
This work was supported in part by the IST Program of the European Community, under the PASCAL Network of Excellence, IST-2002-506778. This publication only reflects the authors’ views.
References
Tikhonov, A., Arsénin, V.: Solutions of ill-posed problems. W.H. Winston (1977)
Girosi, F., Jones, M., Poggio, T.: Regularization theory and neural networks architectures. Neural Computation 7, 219–269 (1995)
Wahba, G.: Spline Models for Observational Data. Series in Applied Mathematics, vol. 59. SIAM, Philadelphia (1990)
Kimeldorf, G., Wahba, G.: Some results on Tchebycheffian spline functions. J. Math. Anal. Applic. 33, 82–95 (1971)
Tibshirani, R.: Regression shrinkage and selection via the lasso. J. Royal Statist. Soc. B 58, 267–288 (1996)
Efron, B., Hastie, T., Johnstone, I., Tibshirani, R.: Least angle regression. Annals of Statistics 32, 407–499 (2004)
Bach, F., Thibaux, R., Jordan, M.: Computing regularization paths for learning multiple kernels. In: Neural Information Processing Systems, vol. 17 (2004)
Mallat, S., Zhang, Z.: Matching pursuits with time-frequency dictionaries. IEEE Transactions on Signal Processing 41, 3397–3415 (1993)
Pati, Y.C., Rezaiifar, R., Krishnaprasad, P.S.: Orthogonal matching pursuit: recursive function approximation with applications to wavelet decomposition. In: 27th Asilomar Conference on Signals, Systems, and Computers (1993)
Vincent, P., Bengio, Y.: Kernel matching pursuit. Machine Learning Journal 48, 165–187 (2002)
Chen, S., Donoho, D., Saunders, M.: Atomic decomposition by basis pursuit. SIAM Journal on Scientific Computing 20, 33–61 (1998)
Chen, S.: Basis Pursuit. PhD thesis, Department of Statistics, Stanford University (1995)
Grandvalet, Y.: Least absolute shrinkage is equivalent to quadratic penalization. In: ICANN, pp. 201–206 (1998)
Loosli, G., Canu, S., Vishwanathan, S., Smola, A.J., Chattopadhyay, M.: Une boîte à outils rapide et simple pour les SVM (a fast and simple toolbox for SVMs). In: CAp (2004)
Ljung, L.: System Identification: Theory for the User. Prentice Hall (1987)
Schölkopf, B., Smola, A.: Learning with Kernels. MIT Press (2002)
Bi, J., Bennett, K., Embrechts, M., Breneman, C., Song, M.: Dimensionality reduction via sparse support vector machines. Journal of Machine Learning Research 3, 1229–1243 (2003)
Donoho, D., Johnstone, I.: Ideal spatial adaptation by wavelet shrinkage. Biometrika 81, 425–455 (1994)
Chang, M., Lin, C.: Leave-one-out bounds for support vector regression model selection. Neural Computation (2005)
Blake, C., Merz, C.: UCI repository of machine learning databases (1998)
Copyright information
© 2005 Springer-Verlag Berlin Heidelberg
Cite this paper
Guigue, V., Rakotomamonjy, A., Canu, S. (2005). Kernel Basis Pursuit. In: Gama, J., Camacho, R., Brazdil, P.B., Jorge, A.M., Torgo, L. (eds) Machine Learning: ECML 2005. Lecture Notes in Computer Science, vol. 3720. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11564096_18
DOI: https://doi.org/10.1007/11564096_18
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-29243-2
Online ISBN: 978-3-540-31692-3
eBook Packages: Computer Science (R0)