Duality Gap Analysis of Weak Relaxed Greedy Algorithms

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 10556)

Abstract

Many problems in machine learning can be stated as convex optimization problems in which the objective is a loss function. This paper examines two weak relaxed greedy algorithms for solving convex optimization problems over convex hulls of atomic sets. Such problems arise as natural convex relaxations of cardinality-constrained problems, many of which are well known to be NP-hard. Both algorithms add at most one atom from a dictionary per iteration and therefore guarantee a prescribed sparsity of the approximate solutions. Both employ the so-called ‘gradient greedy step’, which maximizes a linear functional built from the gradient of the objective at the point obtained in the previous iteration. The algorithms are ‘weak’ in the sense that they solve the linear subproblem of the gradient greedy step only approximately; in addition, the second algorithm uses an approximate solution at the line-search step. Following the ideas of [5], we introduce the notion of the duality gap, whose value is computed at the gradient greedy step of each iteration and is therefore an inherent upper bound on the primal error, i.e., the difference between the objective value at the current point and at an optimal point. We obtain dual convergence estimates for both weak relaxed greedy algorithms.
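
The gradient greedy step above is the linear subproblem familiar from Frank-Wolfe-type methods [2, 5], and the duality gap is the value of that subproblem at the current point, so it costs nothing extra to evaluate. As a minimal illustration of this idea, and not of the paper's own algorithms, the following Python sketch minimizes a smooth convex function over the convex hull of the atoms ±e_i (the unit ℓ1 ball) and uses the gap as a stopping certificate; the function names, the step-size rule, and the quadratic test objective are assumptions made for the example.

```python
import numpy as np

def greedy_minimize_l1_ball(grad_f, dim, n_iter=100, tol=1e-6):
    """Frank-Wolfe-style greedy minimization over the unit l1 ball
    (convex hull of the atoms +/- e_i); illustrative sketch only."""
    x = np.zeros(dim)                       # feasible starting point
    gap = np.inf
    for k in range(n_iter):
        g = grad_f(x)
        # Gradient greedy step: pick the atom s minimizing <g, s> over +/- e_i.
        # A 'weak' variant would accept any atom whose value of this linear
        # functional is within a prescribed factor of the best one.
        i = int(np.argmax(np.abs(g)))
        s = np.zeros(dim)
        s[i] = -np.sign(g[i])
        # Duality gap <g, x - s>: by convexity it upper-bounds f(x) - f(x*),
        # so it doubles as a stopping certificate at no extra cost.
        gap = float(g @ (x - s))
        if gap <= tol:
            break
        # Relaxed update: move towards the atom with the classical open-loop
        # step 2/(k+2); a (possibly approximate) line search could be used.
        gamma = 2.0 / (k + 2)
        x = (1.0 - gamma) * x + gamma * s
    return x, gap

# Example: minimize a strictly convex quadratic over the unit l1 ball.
A = np.array([[2.0, 0.5], [0.5, 1.0]])
b = np.array([1.0, -0.5])
grad_f = lambda x: A @ x - b                # gradient of 0.5*x'Ax - b'x
x_hat, gap = greedy_minimize_l1_ball(grad_f, dim=2)
print(x_hat, gap)                           # gap certifies suboptimality
```

With the exact greedy atom used here, the computed gap upper-bounds f(x_k) − f(x*) by convexity, so stopping once it falls below tol certifies tol-suboptimality; a weak variant would only require the chosen atom to nearly maximize the linear functional, which is what the comment at the greedy step alludes to.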

References

  1. Bubeck, S.: Convex optimization: algorithms and complexity. Found. Trends Mach. Learn. 8(3–4), 231–358 (2015)

  2. Frank, M., Wolfe, P.: An algorithm for quadratic programming. Naval Res. Logist. Quart. 3, 95–110 (1956)

  3. Levitin, E.S., Polyak, B.T.: Constrained minimization methods. USSR Comput. Math. Math. Phys. 6(5), 1–50 (1966)

  4. Clarkson, K.L.: Coresets, sparse greedy approximation, and the Frank-Wolfe algorithm. ACM Trans. Algorithms 6(4), 1–30 (2010)

  5. Jaggi, M.: Revisiting Frank-Wolfe: projection-free sparse convex optimization. In: Proceedings of the 30th International Conference on Machine Learning (ICML 2013), pp. 427–435 (2013)

  6. Freund, R.M., Grigas, P.: New analysis and results for the Frank-Wolfe method. Math. Program. 155(1), 199–230 (2016)

  7. Friedman, J.: Greedy function approximation: a gradient boosting machine. Ann. Stat. 29(5), 1189–1232 (2001)

  8. Davis, G., Mallat, S., Avellaneda, M.: Adaptive greedy approximation. Constr. Approx. 13, 57–98 (1997)

  9. Zhang, Z., Shwartz, S., Wagner, L., Miller, W.: A greedy algorithm for aligning DNA sequences. J. Comput. Biol. (1–2), 203–214 (2000)

  10. Huber, P.J.: Projection pursuit. Ann. Statist. 13, 435–525 (1985)

  11. Jones, L.: On a conjecture of Huber concerning the convergence of projection pursuit regression. Ann. Statist. 15, 880–882 (1987)

  12. Barron, A.R., Cohen, A., Dahmen, W., DeVore, R.A.: Approximation and learning by Greedy algorithms. Ann. Stat. 36(1), 64–94 (2008)

  13. DeVore, R.A., Temlyakov, V.N.: Some remarks on greedy algorithms. Adv. Comput. Math. 5, 173–187 (1996)

  14. Konyagin, S.V., Temlyakov, V.N.: A remark on greedy approximation in Banach spaces. East J. Approx. 5(3), 365–379 (1999)

  15. Temlyakov, V.N.: Greedy approximation in convex optimization. Constr. Approx. 41(2), 269–296 (2015)

  16. Nguyen, H., Petrova, G.: Greedy strategies for convex optimization. Calcolo 41(2), 1–18 (2016)

  17. Temlyakov, V.N.: Dictionary descent in optimization. Anal. Math. 42(1), 69–89 (2016)

  18. DeVore, R.A., Temlyakov, V.N.: Convex optimization on Banach spaces. Found. Comput. Math. 16(2), 369–394 (2016)

  19. Temlyakov, V.N.: Convergence and rate of convergence of some greedy algorithms in convex optimization. Proc. Steklov Inst. Math. 293(1), 325–337 (2016)

  20. Sidorov, S., Mironov, S., Pleshakov, M.: Dual greedy algorithm for conic optimization problem. CEUR Workshop Proc. 1623, 276–283 (2016)

  21. Gao, D.Y.: On unified modeling, theory, and method for solving multi-scale global optimization problems. AIP Conference Proc. 1776(1), 0200051–0200058 (2016)

Acknowledgments

This work was supported by the Russian Foundation for Basic Research under grant 16-01-00507. We are grateful to the reviewers for their very helpful suggestions and comments.

Author information

Correspondence to Sergei P. Sidorov.

Copyright information

© 2017 Springer International Publishing AG

About this paper

Cite this paper

Sidorov, S.P., Mironov, S.V. (2017). Duality Gap Analysis of Weak Relaxed Greedy Algorithms. In: Battiti, R., Kvasov, D., Sergeyev, Y. (eds) Learning and Intelligent Optimization. LION 2017. Lecture Notes in Computer Science, vol 10556. Springer, Cham. https://doi.org/10.1007/978-3-319-69404-7_18

  • DOI: https://doi.org/10.1007/978-3-319-69404-7_18

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-69403-0

  • Online ISBN: 978-3-319-69404-7

  • eBook Packages: Computer Science, Computer Science (R0)
