Global Convergence of the Dai-Yuan Conjugate Gradient Method with Perturbations
In this paper, the authors propose a class of Dai-Yuan (DY) conjugate gradient methods with linesearch in the presence of perturbations, for general functions and for uniformly convex functions respectively. The iterate formula is x_{k+1} = x_k + α_k(s_k + ω_k), where the main direction s_k is generated by the DY conjugate gradient method, ω_k is a perturbation term, and the stepsize α_k is determined by a linesearch and need not tend to zero in the limit. The authors prove global convergence of these methods under mild conditions. Preliminary computational experience is also reported.
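The iteration described above can be sketched in code. The following is a minimal illustrative implementation, not the authors' exact algorithm: it uses the standard Dai-Yuan coefficient β_k = ||g_{k+1}||² / (s_kᵀ(g_{k+1} − g_k)), models ω_k as small random noise, and substitutes a simple backtracking (Armijo) linesearch for the paper's linesearch rule. The function name and all parameters are hypothetical.

```python
import numpy as np

def dy_cg_perturbed(f, grad, x0, max_iter=200, tol=1e-8, noise=1e-6, seed=0):
    """Illustrative Dai-Yuan CG iteration with a perturbed direction.

    Sketch only: the perturbation model (random noise) and the Armijo
    backtracking linesearch are stand-ins, not the paper's exact scheme.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    s = -g                                  # initial main direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        omega = noise * rng.standard_normal(x.size)   # perturbation term omega_k
        d = s + omega                                  # perturbed search direction
        # Backtracking (Armijo) linesearch along the perturbed direction.
        alpha, c = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d) and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Dai-Yuan coefficient: beta_k = ||g_{k+1}||^2 / (s_k^T (g_{k+1} - g_k)),
        # falling back to a steepest-descent restart if the denominator vanishes.
        denom = s.dot(g_new - g)
        beta = g_new.dot(g_new) / denom if abs(denom) > 1e-16 else 0.0
        s = -g_new + beta * s               # new main direction s_{k+1}
        x, g = x_new, g_new
    return x
```

On a smooth convex test problem, the small perturbation limits the attainable accuracy to roughly the noise level, which is consistent with the paper's setting where convergence is proved despite nonvanishing perturbations.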
Keywords: conjugate gradient method, global convergence, perturbation, uniformly convex function