
Nesterov Step Reduced Gradient Algorithm for Convex Programming Problems

  • Conference paper
Big Data and Networks Technologies (BDNT 2019)

Part of the book series: Lecture Notes in Networks and Systems ((LNNS,volume 81))


Abstract

In this paper, we propose an implementation of the speed reduced gradient (SRG) algorithm for minimizing a convex differentiable function subject to linear equality constraints and nonnegativity bounds on the variables. At each iteration, we compute a search direction by the reduced gradient and perform a line search by the bisection algorithm or the Armijo rule. Under some assumptions, the convergence rate of the SRG algorithm is proven to be significantly better, both theoretically and practically. The SRG algorithm is programmed in Matlab and compared with the Frank-Wolfe algorithm on several problems; the numerical results show the efficiency of our approach. We also give applications to ODEs, optimal control, image and video co-localization, and machine learning.
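The abstract's recipe — a gradient-based search direction, an Armijo backtracking line search, and a Nesterov-style acceleration step — can be illustrated with a minimal sketch. This is not the authors' SRG method: it handles only the nonnegativity bounds (by projection) and omits the reduced-gradient treatment of linear equality constraints; the function name and all parameters are illustrative assumptions.

```python
import numpy as np

def nesterov_projected_gradient(f, grad, x0, lr=1.0, beta=0.5, sigma=1e-4,
                                max_iter=500, tol=1e-8):
    """Simplified sketch (not the paper's SRG algorithm): projected
    gradient with a Nesterov extrapolation step and Armijo backtracking.
    Feasible set here is only the nonnegative orthant x >= 0; the paper
    additionally handles linear equality constraints via a
    reduced-gradient decomposition of the variables."""
    x = np.maximum(x0, 0.0)
    x_prev = x.copy()
    t_prev, t = 1.0, 1.0
    for _ in range(max_iter):
        # Nesterov extrapolation point
        y = x + ((t_prev - 1.0) / t) * (x - x_prev)
        g = grad(y)
        # Armijo backtracking on the projected step
        step = lr
        while True:
            x_new = np.maximum(y - step * g, 0.0)  # project onto x >= 0
            if f(x_new) <= f(y) + sigma * g @ (x_new - y) or step < 1e-12:
                break
            step *= beta
        if np.linalg.norm(x_new - x) < tol:
            x = x_new
            break
        x_prev, x = x, x_new
        # standard Nesterov momentum-parameter update
        t_prev, t = t, 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
    return x
```

For example, minimizing f(x) = ||x - c||^2 over x >= 0 with c = (1, -2, 3) recovers the projection of c onto the orthant, (1, 0, 3).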




Correspondence to Abdelkrim El Mouatasim.


Copyright information

© 2020 Springer Nature Switzerland AG

About this paper


Cite this paper

El Mouatasim, A., Farhaoui, Y. (2020). Nesterov Step Reduced Gradient Algorithm for Convex Programming Problems. In: Farhaoui, Y. (eds) Big Data and Networks Technologies. BDNT 2019. Lecture Notes in Networks and Systems, vol 81. Springer, Cham. https://doi.org/10.1007/978-3-030-23672-4_11
