It is worth noting that when interest in conjugate gradient algorithms for quadratic problems subsided, versions for nonconvex differentiable problems were proposed. These proposals relied on the simplicity of their counterparts for quadratic problems. As we have shown in the previous chapter, a conjugate gradient algorithm is an iterative process which requires at each iteration only the current gradient and the previous direction. This simple scheme for calculating the current direction was easy to extend to nonquadratic problems.
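The iteration described above — a new direction built from only the current gradient and the previous direction — can be sketched as follows. This is a minimal illustration, not the chapter's own algorithm: the Fletcher–Reeves coefficient, the Armijo backtracking line search, and the test function are all illustrative choices made here, and the restart safeguard is a common practical addition rather than part of the basic scheme.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, max_iter=1000, tol=1e-8):
    """Nonlinear conjugate gradient sketch (Fletcher-Reeves variant).

    Each iteration uses only the current gradient g and the
    previous direction d: d_new = -g_new + beta * d.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g.dot(d) >= 0:
            d = -g  # restart: ensure d is a descent direction
        # Armijo backtracking line search (an illustrative choice)
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Fletcher-Reeves coefficient: ||g_new||^2 / ||g||^2
        beta = g_new.dot(g_new) / g.dot(g)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage on a simple smooth test function (hypothetical example)
f = lambda x: (x[0] - 1.0) ** 2 + 10.0 * (x[1] + 2.0) ** 2
grad = lambda x: np.array([2.0 * (x[0] - 1.0), 20.0 * (x[1] + 2.0)])
x_star = nonlinear_cg(f, grad, [0.0, 0.0])
```

Note that only first-order information is stored between iterations, which is precisely what makes the quadratic scheme easy to carry over to nonquadratic problems.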
Copyright information
© 2009 Springer-Verlag Berlin Heidelberg
Cite this chapter
(2009). Conjugate Gradient Methods for Nonconvex Problems. In: Conjugate Gradient Algorithms in Nonconvex Optimization. Nonconvex Optimization and Its Applications, vol 89. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-85634-4_2
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-85633-7
Online ISBN: 978-3-540-85634-4
eBook Packages: Mathematics and Statistics (R0)