3.9.10 Advanced Topic: Monitoring and Selecting Algorithms
Functions in Mathematica are carefully set up so that you normally do not have to know how they work inside. But particularly for numerical functions that use iterative algorithms, it is sometimes useful to be able to monitor the internal progress of these algorithms.
StepMonitor	an expression to evaluate whenever a successful step is taken
EvaluationMonitor	an expression to evaluate whenever a function from the input is evaluated

Options for monitoring progress of numerical functions.
This prints the value of x every time a step is taken.
In[1]:= FindRoot[Cos[x] == x, {x, 1}, StepMonitor :> Print[x]]
Out[1]=
Note the importance of using option :> expr rather than option -> expr. You need the delayed rule :> so that expr is evaluated afresh each time it is used, rather than just once when the rule is first given.
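To see the difference, consider what happens with an ordinary rule. The right-hand side is then evaluated immediately, when the option is given, so the monitor never sees the changing values of x (a sketch; the exact form of the printed output may vary):

```mathematica
(* With -> instead of :>, Print[x] is evaluated once, when the rule
   itself is constructed; only the symbolic x is printed, and nothing
   further happens at each step of the iteration. *)
FindRoot[Cos[x] == x, {x, 1}, StepMonitor -> Print[x]]
```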
Reap and Sow provide a convenient way to make a list of the steps taken.
In[2]:= Reap[FindRoot[Cos[x] == x, {x, 1}, StepMonitor :> Sow[x]]]
Out[2]=
This counts the steps.
In[3]:= Block[{ct = 0}, {FindRoot[Cos[x] == x, {x, 1}, StepMonitor :> ct++], ct}]
Out[3]=
To take a successful step towards an answer, iterative numerical algorithms sometimes have to do several evaluations of the functions they have been given. Sometimes this is because each step requires, say, estimating a derivative from differences between function values, and sometimes it is because several attempts are needed to achieve a successful step.
This shows the successful steps taken in reaching the answer.
In[4]:= Reap[FindRoot[Cos[x] == x, {x, 5}, StepMonitor :> Sow[x]]]
Out[4]=
This shows every time the function was evaluated.
In[5]:= Reap[FindRoot[Cos[x] == x, {x, 5}, EvaluationMonitor :> Sow[x]]]
Out[5]=
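StepMonitor and EvaluationMonitor can also be used together in a single computation, for example to count how many function evaluations each successful step required on average (a sketch along the lines of the counting example above):

```mathematica
(* Count successful steps and function evaluations in the same run. *)
Block[{s = 0, e = 0},
 {FindRoot[Cos[x] == x, {x, 5}, StepMonitor :> s++,
   EvaluationMonitor :> e++], s, e}]
```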
The pattern of evaluations done by algorithms in Mathematica can be quite complicated.
In[6]:= ListPlot[Reap[NIntegrate[1/Sqrt[x], {x, 0, 1}, EvaluationMonitor :> Sow[x]]][[2, 1]]]
Out[6]=
Method -> Automatic	pick methods automatically (default)
Method -> "name"	specify an explicit method to use

Method options.
There are often several different methods known for doing particular types of numerical computations. Typically Mathematica supports most generally successful ones that have been discussed in the literature, as well as many that have not. For any specific problem, it goes to considerable effort to pick the best method automatically. But if you have sophisticated knowledge of a problem, or are studying numerical methods for their own sake, you may find it useful to tell Mathematica explicitly what method it should use. The Reference Guide lists some of the methods built into Mathematica; others are discussed in Section A.9.4 or in advanced or online documentation.
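For example, a function such as FindMinimum can be told by name which of its built-in methods to use (a hedged sketch; "Newton" is one documented method name, and the automatically chosen default may differ from it):

```mathematica
(* Explicitly request Newton's method rather than letting
   Mathematica choose a method automatically. *)
FindMinimum[x^4 + 3 x^2 + x, {x, 1}, Method -> "Newton"]
```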
This solves a differential equation using method m, and returns the number of steps and evaluations needed.
In[7]:= try[m_] := Block[{s = e = 0}, NDSolve[{y''[x] + Sin[y[x]] == 0, y'[0] == y[0] == 1}, y, {x, 0, 100}, StepMonitor :> s++, EvaluationMonitor :> e++, Method -> m]; {s, e}]
With the method selected automatically, this is the number of steps and evaluations that are needed.
In[8]:= try[Automatic]
Out[8]=
This shows what happens with several other possible methods. The Adams method that is selected automatically is the fastest.
In[9]:= try /@ {"Adams", "BDF", "ExplicitRungeKutta", "ImplicitRungeKutta", "Extrapolation"}
Out[9]=
This shows what happens with the explicit Runge-Kutta method when the difference order parameter is changed.
In[10]:= Table[try[{"ExplicitRungeKutta", "DifferenceOrder" -> n}], {n, 4, 9}]
Out[10]=
