UnconstrainedProblems Package

Plotting Search Data

The utility functions FindMinimumPlot and FindRootPlot show search data for FindMinimum and FindRoot for one- and two-dimensional functions. They take essentially the same arguments as FindMinimum and FindRoot, except that they also accept options that are passed to the graphics functions used to produce the plots, and they do not have the HoldAll attribute that FindMinimum and FindRoot have.

FindMinimumPlot[f,{x,xst},opts]
plot the steps and the points at which the function f and any of its derivatives were evaluated in FindMinimum[f,{x,xst}], superimposed on a plot of f versus x; opts may include options from both FindMinimum and Plot
FindMinimumPlot[f,{{x,xst},{y,yst}},opts]
plot the steps and the points at which the function f and any of its derivatives were evaluated in FindMinimum[f,{{x,xst},{y,yst}}], superimposed on a contour plot of f as a function of x and y; opts may include options from both FindMinimum and ContourPlot
FindRootPlot[f,{x,xst},opts]
plot the steps and the points at which the function f and any of its derivatives were evaluated in FindRoot[f,{x,xst}], superimposed on a plot of f versus x; opts may include options from both FindRoot and Plot
FindRootPlot[f,{{x,xst},{y,yst}},opts]
plot the steps and the points at which the function f and any of its derivatives were evaluated in FindRoot[f,{{x,xst},{y,yst}}], superimposed on a contour plot of the merit function for f as a function of x and y; opts may include options from both FindRoot and ContourPlot

Plotting search data.

Note that to simplify processing and reduce possible confusion about the function f, FindRootPlot does not accept equations; it finds a root of f = 0.
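For instance, to find and visualize a root of a simple residual (a hypothetical example), pass the residual itself rather than an equation:

  (* pass the residual Exp[x] - 2, not the equation Exp[x] == 2 *)
  FindRootPlot[Exp[x] - 2, {x, 0}]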

Steps and evaluation points are color coded for easy detection as follows:

  • Steps are shown with blue lines and blue points.
  • Function evaluations are shown with green points.
  • Gradient evaluations are shown with red points.
  • Hessian evaluations are shown with cyan points.
  • Residual function evaluations are shown with yellow points.
  • Jacobian evaluations are shown with purple points.
  • The search termination is shown with a large black point.

With the default argument "Plot", FindMinimumPlot and FindRootPlot return a plot of the steps.

With the argument "SolutionData", FindMinimumPlot and FindRootPlot return an Association with the keys Plot, Solution, and Statistics, where:

  • Plot is the graphics object shown.
  • Solution is the result of FindMinimum or FindRoot.
  • Statistics is a list of rules showing the number of steps and evaluations of the function and its derivatives.
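For example, a minimal sketch (the objective function, the placement of the "SolutionData" argument, and the string form of the keys are assumptions; this also assumes the package loaded below):

  sd = FindMinimumPlot[x^2 + Cos[x], {x, 1}, "SolutionData"];
  sd["Plot"]        (* the graphics object shown *)
  sd["Solution"]    (* the result of FindMinimum *)
  sd["Statistics"]  (* rules counting steps and evaluations *)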
This loads the package:
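  Needs["Optimization`UnconstrainedProblems`"]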
This shows in two dimensions the steps and evaluations used by FindMinimum to find a local minimum of the function, starting at the given point. Options are given to ContourPlot so that no contour lines are shown and the function value is indicated by grayscale. Since FindMinimum by default uses the quasi-Newton method, there are only evaluations of the function and gradient, which occur at the same points, indicated by the red circles with green centers:
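A plausible input for this example (the specific function, starting point, and graphics options are assumptions, since the original formulas were not preserved):

  FindMinimumPlot[Cos[x^2 - 3 y] + Sin[x^2 + y^2], {{x, 1}, {y, 1}},
    ContourStyle -> None, ColorFunction -> GrayLevel]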
This shows in two dimensions the steps and evaluations used by FindMinimum to find a local minimum of the function, starting at the given point. Since the problem is a sum of squares, FindMinimum by default uses the Gauss-Newton/Levenberg-Marquardt method, which derives a residual function and evaluates only it and its Jacobian. Points at which the residual function is evaluated are shown with yellow dots. The yellow dots surrounded by a large purple circle are points at which the Jacobian was evaluated as well:
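A sketch with an assumed sum-of-squares objective (the original function was not preserved):

  FindMinimumPlot[(x^2 - y)^2 + (y^2 - x)^2, {{x, 2}, {y, 3}}]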
This shows in two dimensions the steps and evaluations used by FindMinimum to find a local minimum of the function, starting at the given point, using Newton's method. Points at which the function, gradient, and Hessian were all evaluated are shown by concentric green, red, and cyan circles. Note that in this example, all the Newton steps satisfied the Wolfe conditions, so there were no points where the function and gradient were evaluated separately from the Hessian; this is not always the case. Note also that Newton's method finds a different local minimum than the default method:
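A plausible input, reusing the assumed function and starting point from the first example:

  FindMinimumPlot[Cos[x^2 - 3 y] + Sin[x^2 + y^2], {{x, 1}, {y, 1}},
    Method -> "Newton"]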
This shows the steps and evaluations used by FindMinimum to find a local minimum of the function with two starting values, superimposed on a plot of the function. Options are given to Plot so that the curve representing the function is thick and purple. With two starting values, FindMinimum uses the derivative-free principal axis method, so there are only function evaluations, indicated by the green dots:
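A sketch with an assumed one-dimensional function; giving two starting values selects the principal axis method:

  FindMinimumPlot[Cos[x] + Sin[3 x]/2, {x, 0, 1},
    PlotStyle -> {Thick, Purple}]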
This shows in two dimensions the steps and evaluations used by FindRoot to find a root of the function, starting at the given point. As described earlier, the function is a residual, and the default method in FindRoot evaluates the residual and its Jacobian, as shown by the yellow dots and purple circles. Note that this plot is nearly the same as the one produced by FindMinimumPlot with the default method for the corresponding sum-of-squares function, since the residual is the same. FindRootPlot also shows the zero contour of each component of the residual function in red and green:
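A plausible input, using the residual form of the assumed sum-of-squares objective above:

  FindRootPlot[{x^2 - y, y^2 - x}, {{x, 2}, {y, 3}}]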

Test Problems

All the test problems presented in [MGH81] have been coded into the Wolfram Language in the Optimization`UnconstrainedProblems` package. A data structure is used so that the problems can be processed for solution and testing with FindMinimum and FindRoot in a seamless way. The lists of problems for FindMinimum and FindRoot are in $FindMinimumProblems and $FindRootProblems, respectively, and a problem can be accessed using GetFindMinimumProblem and GetFindRootProblem.

$FindMinimumProblems
list of problems that are appropriate for FindMinimum
GetFindMinimumProblem[prob]
get the problem prob using the default size and starting values in a FindMinimumProblem data structure
GetFindMinimumProblem[prob,{n,m}]
get the problem prob with n variables such that it is a sum of m squares in a FindMinimumProblem data structure
GetFindMinimumProblem[prob,size,start]
get the problem prob with given size and starting value start in a FindMinimumProblem data structure
FindMinimumProblem[f,vars,opts,prob,size]
a data structure that contains a minimization problem to be solved by FindMinimum

Accessing FindMinimum problems.

$FindRootProblems
list of problems that are appropriate for FindRoot
GetFindRootProblem[prob]
get the problem prob using the default size and starting values in a FindRootProblem data structure
GetFindRootProblem[prob,n]
get the problem prob with n variables (and n equations) in a FindRootProblem data structure
GetFindRootProblem[prob,n,start]
get the problem prob with size n and starting value start in a FindRootProblem data structure
FindRootProblem[f,vars,opts,prob,size]
a data structure that contains a root-finding problem to be solved by FindRoot

Accessing FindRoot problems.

GetFindMinimumProblem and GetFindRootProblem both pass on the options they are given to be used by the other commands. They also accept the option Variables->vars, which specifies the variables to use for the problem.

option name    default value
Variables    X#&
a function that is applied to the integers 1, …, n to generate the variables for a problem with n variables, or a list of length n containing the variables

Specifying variable names.

This loads the package:
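  Needs["Optimization`UnconstrainedProblems`"]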
This gets the Beale problem in a FindMinimumProblem data structure:
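A plausible input (assuming the symbol Beale names the problem):

  pb = GetFindMinimumProblem[Beale]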
This gets the Powell singular function problem in a FindRootProblem data structure:
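A plausible input (assuming the symbol PowellSingular names the problem):

  pr = GetFindRootProblem[PowellSingular]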

Once you have a FindMinimumProblem or FindRootProblem object, in addition to simply solving the problem, there are various tests that you can run.

ProblemSolve[p,opts]
solve the problem in p, giving the same output as FindMinimum or FindRoot
ProblemStatistics[p,opts]
solve the problem, giving a list {sol,stats}, where sol is the output of ProblemSolve[p] and stats is a list of rules indicating the number of steps and evaluations used
ProblemTime[p,opts]
solve the problem, giving a list {sol,Time->time}, where sol is the output of ProblemSolve[p] and time is the time taken to solve the problem; if this is less than a second, the problem is solved multiple times to get an average timing
ProblemTest[p,opts]
solve the problem, giving a list of rules including the step and evaluation statistics and time from ProblemStatistics[p] and ProblemTime[p], along with rules indicating the accuracy and precision of the solution as compared with a reference solution
FindMinimumPlot[p,opts]
plot the steps and evaluation points for solving the FindMinimumProblem p
FindRootPlot[p,opts]
plot the steps and evaluation points for solving the FindRootProblem p

Operations with FindMinimumProblem and FindRootProblem data objects.

Any of the commands shown can take options that are passed on directly to FindMinimum or FindRoot, overriding any options for these functions that may have been specified when the problem was set up.

This uses FindRoot to solve the Powell singular function problem and gives the root:
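A sketch, using the hypothetical FindRootProblem object pr from above:

  ProblemSolve[pr]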
This does the same as the previous example, but includes statistics on steps and evaluations required:
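A sketch of the corresponding input:

  ProblemStatistics[pr]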
This uses FindMinimum to solve the Beale problem and averages the timing over several trials to get the average time it takes to solve the problem:
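A sketch, using the hypothetical FindMinimumProblem object pb from above:

  ProblemTime[pb]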
This uses FindMinimum to solve the Beale problem, compares the result with a reference solution, and gives a list of rules indicating the results of the test:
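A sketch of the corresponding input:

  ProblemTest[pb]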

ProblemTest gives a way to easily compare two different methods for the same problem.

This uses FindMinimum to solve the Beale problem using Newton's method, compares the result with a reference solution, and gives a list of rules indicating the results of the test:
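A sketch; the method option is passed through to FindMinimum:

  ProblemTest[pb, Method -> "Newton"]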

Most of the rules returned by these functions are self-explanatory, but a few require some description. Here is a table clarifying those rules.

"FunctionAccuracy"the accuracy of the function value -Log[10,error in f]
"FunctionPrecision"the precision of the function value -Log[10,relative error in f]
"SpatialAccuracy"the accuracy in the position of the minimizer or root -Log[10,error in x]
"SpatialPrecision"the precision in the position of the minimizer or root -Log[10,relative error in x]
"Messages"a list of messages issued during the solution of the problem

A very useful comparison is to see how different methods perform on a particular problem. This is easy to do by setting up a FindMinimumProblem object and mapping a problem test over a list of methods.

This gets the Chebyquad problem. The output has been abbreviated to save space:
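A plausible input (assuming the symbol ChebyQuad names the problem):

  cq = GetFindMinimumProblem[ChebyQuad]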
Here is a list of possible methods:
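For example (an assumed selection of FindMinimum method names):

  methods = {"ConjugateGradient", "LevenbergMarquardt", "Newton", "QuasiNewton"};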
This makes a table comparing the different methods in terms of accuracy and computation time:
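A sketch (the rule names "SpatialAccuracy" and Time are taken from the tables above; the layout is an assumption):

  TableForm[
    Map[Function[m, With[{res = ProblemTest[cq, Method -> m]},
      {m, "SpatialAccuracy" /. res, Time /. res}]], methods],
    TableHeadings -> {None, {"method", "spatial accuracy", "time (s)"}}]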

It is possible to generate tables of how a particular method performs on a variety of problems by mapping over the names in $FindMinimumProblems or $FindRootProblems.

This sets up a function that tests a problem with FindMinimum using its default settings except with a large setting for MaxIterations so that the default (LevenbergMarquardt) method can run to convergence:
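A sketch of such a function (the name testDefault is hypothetical):

  testDefault[prob_] := ProblemTest[GetFindMinimumProblem[prob], MaxIterations -> 1000]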
This makes a table showing some of the results from testing all the problems in $FindMinimumProblems. It may take several minutes to run:
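A sketch (the rule names are taken from the table above; the layout is an assumption):

  TableForm[
    Map[Function[p, With[{res = testDefault[p]},
      {p, "FunctionAccuracy" /. res, "SpatialAccuracy" /. res}]],
     $FindMinimumProblems]]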

The two cases where the spatial accuracy is shown as ERROR are linear problems, which do not have an isolated minimizer. In the one case where the spatial accuracy is quite poor, the problem has multiple minimizers, and the method went to a different minimum than the reference one. Many of these functions have multiple local minima, so be aware that an error may be reported as large only because a method went to a different minimum than the reference one.