In Sect. 2.10 we presented the conjugate gradient algorithm derived from the Hestenes-Stiefel method. The method is globally convergent under the Wolfe line search rules. Its strong convergence properties are achieved by modifying the coefficient β_k in such a way that the method is no longer equivalent to the nonlinear Hestenes-Stiefel conjugate gradient algorithm. In Sect. 3.2 we show that the nonlinear Polak-Ribière method is equivalent to the nonlinear Hestenes-Stiefel algorithm provided that the directional minimization is exact. With that in mind, and given that Hager and Zhang do not stipulate condition (2.68) in Theorem 2.14, their main convergence result is remarkable.
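As a numerical illustration of the equivalence mentioned above (a sketch, not taken from the chapter): on a quadratic objective the line search can be carried out exactly in closed form, and then the Hestenes-Stiefel coefficient β_HS = g_{k+1}ᵀy_k / (d_kᵀy_k) and the Polak-Ribière coefficient β_PR = g_{k+1}ᵀy_k / (g_kᵀg_k) coincide, since exact minimization forces g_{k+1}ᵀd_k = 0 and hence d_kᵀy_k = −d_kᵀg_k = g_kᵀg_k. The matrix A and vector b below are arbitrary choices for the demonstration.

```python
import numpy as np

# Arbitrary symmetric positive definite quadratic: f(x) = 0.5 x'Ax - b'x
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 0.5],
              [0.0, 0.5, 2.0]])
b = np.array([1.0, 2.0, 3.0])

x = np.zeros(3)
g = A @ x - b          # gradient of the quadratic
d = -g                 # initial steepest-descent direction
betas = []

for k in range(2):
    # Exact line search along d for a quadratic objective
    alpha = -(g @ d) / (d @ (A @ d))
    x_new = x + alpha * d
    g_new = A @ x_new - b
    y = g_new - g                       # gradient difference y_k
    beta_hs = (g_new @ y) / (d @ y)     # Hestenes-Stiefel coefficient
    beta_pr = (g_new @ y) / (g @ g)     # Polak-Ribiere coefficient
    betas.append((beta_hs, beta_pr))
    d = -g_new + beta_hs * d
    x, g = x_new, g_new
```

With an inexact (e.g. Wolfe) line search the orthogonality g_{k+1}ᵀd_k = 0 no longer holds, d_kᵀy_k ≠ g_kᵀg_k in general, and the two nonlinear methods genuinely differ, which is why the distinction matters for the convergence theory discussed in this chapter.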
© 2009 Springer-Verlag Berlin Heidelberg
(2009). Memoryless Quasi-Newton Methods. In: Conjugate Gradient Algorithms in Nonconvex Optimization. Nonconvex Optimization and Its Applications, vol 89. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-85634-4_3
Print ISBN: 978-3-540-85633-7
Online ISBN: 978-3-540-85634-4
eBook Packages: Mathematics and Statistics (R0)