Complexity and Applications of the Homotopy Principle for Uniformly Constrained Sparse Minimization

  • Christoph Brauer
  • Dirk A. Lorenz

Abstract

In this paper, we investigate the homotopy path associated with \(\ell_{1}\)-norm minimization problems under \(\ell_{\infty}\)-norm constraints. We establish an improved upper bound on the number of linear segments in the path and provide an example showing that, in the worst case, the number of segments is exponential in the number of variables. We also use the homotopy framework to develop grid-independent (cross-)validation schemes for sparse linear discriminant analysis and classification that exploit the entire path. Several numerical and statistical examples illustrate the applicability of the framework.
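For orientation, the problem family in question can plausibly be written (following the standard \(\ell_{\infty}\)-constrained formulation; the exact parametrization used in the paper may differ) as

\[
\min_{x \in \mathbb{R}^{n}} \; \|x\|_{1} \quad \text{subject to} \quad \|Ax - b\|_{\infty} \le \delta ,
\]

where the homotopy path is the map \(\delta \mapsto x(\delta)\). As stated in the abstract, this path consists of linear segments, and the complexity results concern the number of such segments (equivalently, the breakpoints of the path) as \(\delta\) varies.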

Keywords

Convex optimization · Nonsmooth optimization · Homotopy methods · Primal-dual methods · Binary classification · Cross-validation

Mathematics Subject Classification

65C60 · 62H30 · 90C05 · 90C25 · 65K05


Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2019

Authors and Affiliations

  1. TU Braunschweig, Braunschweig, Germany
