Abstract
Because they use a very simple computational scheme, rest on a well-elaborated convergence theory, and require only modest computational resources to implement in computer codes, conjugate gradient methods are of prime importance for solving large-scale unconstrained optimization problems and real applications.
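To illustrate the simplicity of the computational scheme the abstract refers to, here is a minimal sketch of one nonlinear conjugate gradient method (the Fletcher–Reeves variant with Armijo backtracking). This is a generic illustration, not the specific algorithms developed in the chapter; the function names and parameter values are illustrative choices.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, max_iter=1000, tol=1e-8):
    """Minimize f with the Fletcher-Reeves nonlinear conjugate gradient
    method, using a simple Armijo backtracking line search.
    (Illustrative sketch; parameter values are conventional defaults.)"""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                              # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking line search enforcing the Armijo condition
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Fletcher-Reeves conjugacy parameter
        beta = g_new.dot(g_new) / g.dot(g)
        d = -g_new + beta * d
        # Restart with steepest descent if d fails to be a descent direction
        if g_new.dot(d) >= 0:
            d = -g_new
        x, g = x_new, g_new
    return x

# Example: minimize the convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose unique minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = nonlinear_cg(f, grad, np.zeros(2))
```

The per-iteration cost is a few vector operations plus one gradient evaluation, which is what makes methods of this family attractive for large-scale problems.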
© 2020 Springer Nature Switzerland AG
Cite this chapter
Andrei, N. (2020). Discussions, Conclusions, and Large-Scale Optimization. In: Nonlinear Conjugate Gradient Methods for Unconstrained Optimization. Springer Optimization and Its Applications, vol 158. Springer, Cham. https://doi.org/10.1007/978-3-030-42950-8_12
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-42949-2
Online ISBN: 978-3-030-42950-8
eBook Packages: Mathematics and Statistics (R0)