Bounds on the Prediction Error of Penalized Least Squares Estimators with Convex Penalty

  • Pierre Bellec
  • Alexandre Tsybakov
Conference paper
Part of the Springer Proceedings in Mathematics & Statistics book series (PROMS, volume 208)

Abstract

This paper considers the penalized least squares estimator with an arbitrary convex penalty. When the observation noise is Gaussian, we show that the prediction error is a subgaussian random variable concentrated around its median. We apply this concentration property to derive sharp oracle inequalities for the prediction error of the LASSO, the group LASSO, and the SLOPE estimators, both in probability and in expectation. In contrast to previous work on LASSO-type methods, our oracle inequalities in probability hold at any confidence level for estimators whose tuning parameters do not depend on that level. This is also what allows us to establish sparsity oracle bounds in expectation for LASSO-type estimators, whereas previously known techniques did not permit control of the expected risk. In addition, we show that the concentration rate in the oracle inequalities is better than was commonly understood before.
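
For concreteness, the setting described above can be written in the following standard form; the notation is illustrative and not quoted from the paper. Given observations $y = X\beta^* + \varepsilon$ with design matrix $X \in \mathbb{R}^{n \times p}$ and a convex penalty $h$, the penalized least squares estimator is

$$\hat\beta \in \operatorname*{arg\,min}_{u \in \mathbb{R}^{p}} \Big( \|y - Xu\|_2^2 + 2\,h(u) \Big),$$

and the three penalties discussed in the abstract correspond to, for tuning parameters $\lambda \ge 0$, $\lambda_1 \ge \dots \ge \lambda_p \ge 0$, and a partition of $\{1,\dots,p\}$ into groups $G_1,\dots,G_K$,

$$h_{\mathrm{LASSO}}(u) = \lambda \|u\|_1, \qquad h_{\mathrm{group}}(u) = \lambda \sum_{k=1}^{K} \|u_{G_k}\|_2, \qquad h_{\mathrm{SLOPE}}(u) = \sum_{j=1}^{p} \lambda_j |u|_{(j)},$$

where $|u|_{(1)} \ge \dots \ge |u|_{(p)}$ denote the entries of $u$ sorted in decreasing order of absolute value.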

Keywords

Penalized least squares · Oracle inequality · LASSO estimator · SLOPE estimator · Group LASSO

Acknowledgements

This work was supported by GENES and by the French National Research Agency (ANR) under the grants IPANEMA (ANR-13-BSH1-0004-02) and Labex Ecodec (ANR-11-LABEX-0047). It was also supported by the “Chaire Economie et Gestion des Nouvelles Données”, under the auspices of Institut Louis Bachelier, Havas-Media and Paris-Dauphine.


Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  1. Rutgers University, Piscataway, USA
  2. ENSAE ParisTech, Malakoff Cedex, France