# Newton's Method

Newton's method for nonlinear equations is based on a linear approximation of $f$ near the current point $x_k$,

$$f(x) \approx M_{x_k}(x) = f(x_k) + J(x_k)\,(x - x_k),$$

where $J(x_k)$ is the Jacobian matrix, so the Newton step is found simply by setting $M_{x_k}(x) = 0$,

$$x_{k+1} = x_k - J(x_k)^{-1} f(x_k).$$

Near a root of the equations, Newton's method has q-quadratic convergence, similar to Newton's method for minimization. Newton's method is used as the default method for FindRoot.
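The iteration above can be sketched in a few lines. The following is a minimal Python analogue (not Mathematica's implementation; the function names `newton_system`, `F`, and `J` are mine), applied to the Rosenbrock residuals used later in this section, whose root is at $(1, 1)$:

```python
import numpy as np

def newton_system(F, J, x0, tol=1e-12, max_iter=50):
    """Plain Newton iteration for F(x) = 0: solve J(x) s = -F(x), then x += s."""
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            return x, k                      # converged after k steps
        s = np.linalg.solve(J(x), -Fx)       # Newton step
        x = x + s
    return x, max_iter

# Rosenbrock residuals 10 (y - x^2) == 0, 1 - x == 0, with root (1, 1)
F = lambda v: np.array([10.0 * (v[1] - v[0]**2), 1.0 - v[0]])
J = lambda v: np.array([[-20.0 * v[0], 10.0], [-1.0, 0.0]])

root, iters = newton_system(F, J, [-1.2, 1.0])
```

Because the second residual is linear, the full Newton iteration dispatches this particular system in very few steps; the q-quadratic rate matters for genuinely nonlinear systems.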

Newton's method can be used with either line search or trust region step control. When it works, the line search control is typically faster, but the trust region approach is usually more robust.
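As an illustration of line search step control, here is a hedged sketch (mine, not Mathematica's exact algorithm) that backtracks along the Newton direction until the merit function $\tfrac{1}{2}\|F(x)\|^2$ satisfies a sufficient-decrease (Armijo) condition:

```python
import numpy as np

def newton_line_search(F, J, x0, tol=1e-10, max_iter=200):
    """Newton's method with backtracking line search on the merit function
    m(x) = 0.5*||F(x)||^2.  Illustrative only; real implementations use
    more sophisticated step control."""
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            return x, k
        s = np.linalg.solve(J(x), -Fx)           # full Newton direction
        m0, slope = 0.5 * Fx @ Fx, -(Fx @ Fx)    # merit and its slope along s
        t = 1.0
        while (0.5 * np.linalg.norm(F(x + t * s))**2 > m0 + 1e-4 * t * slope
               and t > 1e-10):
            t *= 0.5                             # backtrack until enough decrease
        x = x + t * s
    return x, max_iter

# Rosenbrock residuals, root at (1, 1)
F = lambda v: np.array([10.0 * (v[1] - v[0]**2), 1.0 - v[0]])
J = lambda v: np.array([[-20.0 * v[0], 10.0], [-1.0, 0.0]])

sol, nsteps = newton_line_search(F, J, [-1.2, 1.0])
```

The Newton direction is always a descent direction for the merit function (its directional derivative is $-F^{\mathsf T}F < 0$), which is why backtracking along it is guaranteed to find a decrease.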

This loads a package that contains some utility functions.

Here is the Rosenbrock problem as a FindRoot problem.


This finds the solution of the nonlinear system using the default line search approach. (Newton's method is the default method for FindRoot.)


Note that each of the line searches started along the line $x = 1$. This is a particular property of the Newton step for this particular problem.

This computes the Jacobian and the Newton step symbolically for the Rosenbrock problem.


When this step is added to the point $(x, y)$, the first component of the result is identically $1$, so it is easy to see why the steps go to the line $x = 1$. This is a particular feature of this problem, which is not typical for most functions. Because the trust region approach does not try the Newton step unless it lies within the region bound, this feature does not show up so strongly when the trust region step control is used.
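The Wolfram Language input that produced the symbolic result is not preserved above. As a numerical cross-check of the same fact (a Python sketch; the residuals `F` and Jacobian `J` are written out by hand), one can verify that a full Newton step for the Rosenbrock residuals lands on the line $x = 1$ from any starting point, because the second equation $1 - x = 0$ is linear and is therefore solved exactly by one step:

```python
import numpy as np

# Rosenbrock residuals and their Jacobian, written out by hand
F = lambda v: np.array([10.0 * (v[1] - v[0]**2), 1.0 - v[0]])
J = lambda v: np.array([[-20.0 * v[0], 10.0], [-1.0, 0.0]])

rng = np.random.default_rng(seed=0)
landings = []
for _ in range(5):
    p = rng.uniform(-2.0, 2.0, size=2)           # random starting point
    newton_step = np.linalg.solve(J(p), -F(p))   # full Newton step
    landings.append((p + newton_step)[0])        # x-coordinate after the step

# every entry of `landings` is (numerically) 1
```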

This finds the solution of the nonlinear system using the trust region approach. The search is almost identical to the search with the Gauss-Newton method for the Rosenbrock objective function in FindMinimum.


When the Jacobian matrix is sparse, *Mathematica* will use SparseArray objects both to compute the Jacobian and to handle the necessary numerical linear algebra.

When the solution of nonlinear equations is used as part of a more general numerical procedure, such as solving differential equations with implicit methods, the starting values are often quite good, and complete convergence is not absolutely necessary. The most expensive part of computing a Newton step is often evaluating the Jacobian and computing its matrix factorization. When close enough to a root, it is possible to leave the Jacobian frozen for a few steps (though this does certainly affect the convergence rate). You can do this in *Mathematica* using the method option "UpdateJacobian", which gives the number of steps to take before updating the Jacobian. The default is "UpdateJacobian" -> 1, so the Jacobian is updated every step.
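A frozen-Jacobian iteration can be sketched as follows (an illustrative Python analogue of the "UpdateJacobian" idea, not Mathematica's implementation; in practice one would reuse the matrix *factorization*, which is where the savings come from):

```python
import numpy as np

def newton_frozen(F, J, x0, update_every=3, tol=1e-12, max_iter=100):
    """Newton iteration that re-evaluates the Jacobian only every
    `update_every` steps; intermediate steps reuse the old Jacobian."""
    x = np.asarray(x0, dtype=float)
    jac, jac_evals = None, 0
    for k in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            return x, k, jac_evals
        if k % update_every == 0:            # refresh the Jacobian periodically
            jac = J(x)
            jac_evals += 1
        x = x + np.linalg.solve(jac, -Fx)    # step with possibly stale Jacobian
    return x, max_iter, jac_evals

# Simple square root: solve x^2 - 2 == 0
Fsq = lambda v: np.array([v[0]**2 - 2.0])
Jsq = lambda v: np.array([[2.0 * v[0]]])

approx, steps, jevals = newton_frozen(Fsq, Jsq, [1.0], update_every=3)
```

With a stale Jacobian the local convergence drops from quadratic to linear between refreshes, so a few extra steps are traded for fewer Jacobian evaluations.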

This shows the number of steps, function evaluations, and Jacobian evaluations required to find a simple square root when the Jacobian is only updated every three steps.


This shows the number of steps, function evaluations, and Jacobian evaluations required to find a simple square root when the Jacobian is updated every step.


Of course, for a simple one-dimensional root, updating the Jacobian is trivial in cost, so holding the update is used here only to demonstrate the idea.

| option name | default value | |
| --- | --- | --- |
| "UpdateJacobian" | 1 | number of steps to take before updating the Jacobian |
| "StepControl" | "LineSearch" | method for step control; can be "LineSearch", "TrustRegion", or None (which is not recommended) |

Method options for Method->"Newton" in FindRoot.