This is documentation for Mathematica 6, which was
based on an earlier version of the Wolfram Language.

# Introduction to Local Minimization

The essence of most methods is in the local quadratic model
that is used to determine the next step. The FindMinimum function in Mathematica has five essentially different ways of choosing this model, controlled by the Method option. These methods are similarly used by FindMaximum and FindFit.
| Method | Description |
| --- | --- |
| "Newton" | uses the exact Hessian, or a finite difference approximation if the symbolic derivative cannot be computed |
| "QuasiNewton" | uses the quasi-Newton BFGS approximation to the Hessian, built up from updates based on past steps |
| "LevenbergMarquardt" | a Gauss-Newton method for least-squares problems; the Hessian is approximated by J^T J, where J is the Jacobian of the residual function |
| "ConjugateGradient" | a nonlinear version of the conjugate gradient method for solving linear systems; a model Hessian is never formed explicitly |
| "PrincipalAxis" | works without using any derivatives, not even the gradient, by keeping values from past steps; requires two starting conditions in each variable |

Basic method choices for FindMinimum.
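As a sketch of how a method is selected explicitly, the following calls minimize the classic Rosenbrock function (the test function and starting points here are illustrative, not from this tutorial):

```mathematica
(* The Rosenbrock function has its minimum, 0, at x = 1, y = 1 *)

(* Minimize with the quasi-Newton BFGS method *)
FindMinimum[(1 - x)^2 + 100 (y - x^2)^2, {{x, -1}, {y, 1}},
  Method -> "QuasiNewton"]

(* The same problem with Newton's method, which uses the exact Hessian *)
FindMinimum[(1 - x)^2 + 100 (y - x^2)^2, {{x, -1}, {y, 1}},
  Method -> "Newton"]
```

Both calls return the minimum value together with a rule list for the variables, but they typically differ in the number of steps and function evaluations taken.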

With Method->Automatic, Mathematica uses the quasi-Newton method unless the problem is structurally a sum of squares, in which case the Levenberg-Marquardt variant of the Gauss-Newton method is used. When given two starting conditions in each variable, the principal axis method is used.
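A brief sketch of how the automatic choice plays out (the objective functions below are illustrative examples, not taken from this tutorial):

```mathematica
(* Structurally a sum of squares: the Levenberg-Marquardt
   variant of Gauss-Newton is selected automatically *)
FindMinimum[(x - 1)^2 + (x y - 2)^2, {{x, 0.5}, {y, 0.5}}]

(* Two starting conditions in each variable: the derivative-free
   principal axis method is selected automatically *)
FindMinimum[Cos[x] + Sin[x y], {{x, 1, 2}, {y, 1, 2}}]
```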