Introduction to Local Minimization
The essence of most local minimization methods is the local model of the objective function that is used to determine the next step. The FindMinimum function in the Wolfram Language has five essentially different ways of choosing this model, controlled by the Method option. These methods are similarly used by FindMaximum and FindFit.
|"Newton"||use the exact Hessian or a finite difference approximation if the symbolic derivative cannot be computed|
|"QuasiNewton"||use the quasi-Newton BFGS approximation to the Hessian built up by updates based on past steps|
|"LevenbergMarquardt"||a Gauss–Newton method for least-squares problems; the Hessian is approximated by J^T J, where J is the Jacobian of the residual function|
|"ConjugateGradient"||a nonlinear version of the conjugate gradient method for solving linear systems; a model Hessian is never formed explicitly|
|"PrincipalAxis"||works without using any derivatives, not even the gradient, by keeping values from past steps; it requires two starting conditions in each variable|
Basic method choices for FindMinimum.
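As a rough illustration of how these method choices differ in practice, the sketch below uses SciPy rather than the Wolfram Language itself; the `scipy.optimize` method names are assumptions about the closest counterparts (`Newton-CG` for a Newton-type step, `BFGS` for quasi-Newton, `CG` for nonlinear conjugate gradient, and Powell's derivative-free method in the spirit of `"PrincipalAxis"`), applied to the classic Rosenbrock test function.

```python
# Rough SciPy analogues of FindMinimum's basic method choices.
# Illustrative only: FindMinimum is a Wolfram Language function, and the
# scipy.optimize method names below are assumed closest counterparts.
import numpy as np
from scipy.optimize import minimize

def rosenbrock(x):
    # Classic test problem with minimum at (1, 1).
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

def rosenbrock_grad(x):
    # Exact gradient, used by the derivative-based methods.
    return np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2),
    ])

x0 = np.array([-1.2, 1.0])

# "Newton"-like: uses second-derivative information (here via Newton-CG).
newton = minimize(rosenbrock, x0, method="Newton-CG", jac=rosenbrock_grad)

# "QuasiNewton": BFGS builds a Hessian approximation from past steps.
bfgs = minimize(rosenbrock, x0, method="BFGS", jac=rosenbrock_grad)

# "ConjugateGradient": a model Hessian is never formed explicitly.
cg = minimize(rosenbrock, x0, method="CG", jac=rosenbrock_grad)

# "PrincipalAxis"-like: Powell's method uses no derivatives at all.
powell = minimize(rosenbrock, x0, method="Powell")

for res in (newton, bfgs, cg, powell):
    print(res.x, res.fun)
```

All four runs should approach the minimizer (1, 1), but they trade derivative information for robustness and cost in different ways, which is the point of having several methods available.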
With Method->Automatic, the Wolfram Language uses the quasi-Newton method unless the problem is structurally a sum of squares, in which case the Levenberg–Marquardt variant of the Gauss–Newton method is used. When given two starting conditions in each variable, the principal axis method is used.
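The special treatment of sums of squares can be sketched in the same SciPy setting: when the objective has the form f(x) = Σ rᵢ(x)², a Levenberg–Marquardt solver can work with the residual vector r directly instead of the scalar objective. This is an assumed analogue of the automatic Levenberg–Marquardt selection, not FindMinimum itself.

```python
# When the objective is structurally a sum of squares, f(x) = sum r_i(x)^2,
# a Levenberg-Marquardt solver exploits that structure by operating on the
# residual vector r. Illustrative SciPy sketch (assumed analogue).
import numpy as np
from scipy.optimize import least_squares

def residuals(x):
    # Rosenbrock written as residuals: f(x) = r1(x)^2 + r2(x)^2.
    return np.array([1 - x[0], 10 * (x[1] - x[0]**2)])

x0 = np.array([-1.2, 1.0])

# method="lm" selects Levenberg-Marquardt; the Hessian of f is approximated
# by J^T J, where J is the Jacobian of the residuals.
sol = least_squares(residuals, x0, method="lm")
print(sol.x)
```

Because the J^T J approximation captures the full Hessian near a zero-residual minimum, this structure-aware approach typically converges faster on least-squares problems than a generic quasi-Newton run on the scalar objective.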