Multivariate adaptive regression splines (MARS) is a modern statistical learning methodology that is important in both classification and regression. It is very useful for high-dimensional problems and shows great promise for fitting nonlinear multivariate functions: by estimating the contributions of basis functions, it allows both additive and interaction effects of the predictors to determine the response variable. The MARS algorithm for estimating the model function consists of two sub-algorithms. In this paper, we propose not to use the second (backward stepwise) sub-algorithm. Instead, we construct a penalized residual sum of squares (PRSS) for MARS as a higher-order Tikhonov regularization problem, also known as ridge regression, which shrinks coefficients and makes them more stable. Ridge regression, however, cannot perform variable selection and hence does not yield an easily interpretable model, especially when the number of variables p is large. For this reason, we replace the Tikhonov penalty with the generalized Lasso penalty in the PRSS problem, thereby gaining the advantage of feature selection. We treat this problem with continuous optimization techniques, which we consider to be an important complementary technology and model-based alternative to the backward stepwise algorithm. In particular, we apply the elegant framework of conic quadratic programming (CQP), and we call the resulting solution CG-Lasso. Here, we benefit from an area of convex optimization whose programs are very well structured, resembling linear programming and hence permitting the use of powerful interior point methods (IPMs).
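The contrast drawn above, Tikhonov (ridge) regularization shrinking all coefficients while the Lasso penalty additionally drives irrelevant ones exactly to zero, can be illustrated numerically. The following is a minimal sketch, not the paper's CQP-based CG-Lasso method: it uses the closed-form ridge solution and a plain coordinate-descent Lasso on synthetic data in which only two of five predictors matter; all names and parameter values are illustrative assumptions.

```python
import numpy as np

# Synthetic data: y depends only on the first 2 of 5 predictors.
rng = np.random.default_rng(0)
n, p = 200, 5
X = rng.standard_normal((n, p))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.1 * rng.standard_normal(n)

# Ridge (Tikhonov): closed-form solution of min ||y - Xb||^2 + lam ||b||^2.
lam_ridge = 1.0
ridge = np.linalg.solve(X.T @ X + lam_ridge * np.eye(p), X.T @ y)

# Lasso: coordinate descent for min (1/2)||y - Xb||^2 + lam ||b||_1.
def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

lam_lasso = 10.0  # illustrative choice, not tuned
lasso = np.zeros(p)
for _ in range(500):
    for j in range(p):
        # Partial residual with coordinate j left out.
        r_j = y - X @ lasso + X[:, j] * lasso[j]
        lasso[j] = soft_threshold(X[:, j] @ r_j, lam_lasso) / (X[:, j] @ X[:, j])

print("ridge:", np.round(ridge, 3))  # every coefficient shrunk but nonzero
print("lasso:", np.round(lasso, 3))  # irrelevant coefficients set exactly to 0
```

The ridge estimate keeps all five predictors in the model with small but nonzero weights, whereas the Lasso estimate retains only the two truly active predictors, which is exactly the interpretability gain that motivates swapping the penalty in the PRSS problem.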
Keywords: Multivariate adaptive regression spline · Ridge regression · Continuous optimization · Conic quadratic programming · Interior point methods · Inverse problems · Exterior point methods · Regularization · Singular value decomposition · Convex problems