Second-Order Optimisation Methods

  • Adrian J. Shepherd
Part of the Perspectives in Neural Computing book series (PERSPECT.NEURAL)


With the essential background information of Chapter 2 behind us, we are ready to turn our attention to specific second-order optimisation methods. The survey of multivariate second-order methods presented in this chapter is necessarily selective, with the focus on tried and tested methods that have a reputation for both speed and reliability. Two types of multivariate second-order method are considered: general methods (Section 3.3) and nonlinear least-squares methods (Section 3.4). The former are suitable for finding a minimum of any smooth nonlinear function F(x_k); the latter are suitable only when F(x_k) is of the special form given by (1.5).
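To make the distinction concrete, the following sketch shows a Gauss-Newton iteration for a least-squares objective F(x) = ½‖r(x)‖², the special form referred to above. The residual function used here is a hypothetical toy example (Rosenbrock's function written in least-squares form), not one taken from the book, and the implementation is a minimal illustration rather than a production method.

```python
import numpy as np

# Hypothetical toy residuals r(x): Rosenbrock's function in least-squares
# form, so that F(x) = 0.5 * ||r(x)||^2 has its minimum at x = (1, 1).
def residuals(x):
    return np.array([x[0] - 1.0, 10.0 * (x[1] - x[0] ** 2)])

# Analytic Jacobian of the residual vector.
def jacobian(x):
    return np.array([[1.0, 0.0],
                     [-20.0 * x[0], 10.0]])

def gauss_newton(x0, tol=1e-12, max_iter=50):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = residuals(x)
        J = jacobian(x)
        # The gradient of F is J^T r; stop when it is (numerically) zero.
        if np.linalg.norm(J.T @ r) < tol:
            break
        # Gauss-Newton step: solve the normal equations (J^T J) p = -J^T r,
        # i.e. use J^T J in place of the full Hessian of F.
        p = np.linalg.solve(J.T @ J, -J.T @ r)
        x = x + p
    return x

x_star = gauss_newton([-1.2, 1.0])  # converges to the minimiser (1, 1)
```

A general second-order method would instead require the full Hessian of F; the appeal of the least-squares form is that J^T J approximates that Hessian using first derivatives of the residuals only.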





Copyright information

© Springer-Verlag London Limited 1997

Authors and Affiliations

  • Adrian J. Shepherd
  1. Department of Biochemistry and Molecular Biology, University College London, London, UK
