
3.9.8 Numerical Optimization

FindMinimum[f, {x, x0}]   search for a local minimum of f, starting at x = x0
FindMaximum[f, {x, x0}]   search for a local maximum of f, starting at x = x0

Searching for minima and maxima.

This finds the value of x which minimizes Gamma[x], starting at x = 2.

In[1]:= FindMinimum[Gamma[x], {x, 2}]

Out[1]=

The last element of the list gives the value of x at which the minimum is achieved.

In[2]:= Gamma[x] /. Last[%]

Out[2]=
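
As a small sketch (not part of the original session), substituting into x itself rather than Gamma[x] picks out the location of the minimum:

(* extract the position of the minimum; the value is near x = 1.46 *)
x /. Last[FindMinimum[Gamma[x], {x, 2}]]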

Like FindRoot, FindMinimum and FindMaximum work by starting from a point, then progressively searching for a minimum or maximum. But since they return a result as soon as they find anything, they may give only a local minimum or maximum of your function, not a global one.

This curve has two local minima.

In[3]:= Plot[x^4 - 3x^2 + x, {x, -3, 2}]

Out[3]=

Starting at x = 1, you get the local minimum on the right.

In[4]:= FindMinimum[x^4 - 3 x^2 + x, {x, 1}]

Out[4]=

This gives the local minimum on the left, which in this case is also the global minimum.

In[5]:= FindMinimum[x^4 - 3 x^2 + x, {x, -1}]

Out[5]=
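
FindMaximum works the same way. As a sketch (not part of the original session), starting near x = 0 finds the local maximum that lies between the two minima of this curve:

(* search for a local maximum, starting from x = 0 *)
FindMaximum[x^4 - 3 x^2 + x, {x, 0}]
(* returns {fmax, {x -> xmax}}, with xmax near 0.17 *)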

NMinimize[f, x]   try to find the global minimum of f numerically
NMaximize[f, x]   try to find the global maximum of f numerically

Finding global minima and maxima.

This immediately finds the global minimum.

In[6]:= NMinimize[x^4 - 3x^2 + x, x]

Out[6]=

NMinimize and NMaximize are numerical analogs of Minimize and Maximize. But unlike Minimize and Maximize, they usually cannot guarantee that they will find absolute global minima and maxima. Nevertheless, they typically work well when the function f is fairly smooth and has a limited number of local minima and maxima.
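
As an illustrative sketch (not part of the original session), NMaximize is used in exactly the same way; here it is applied to the negated curve from above:

(* attempt to find the global maximum of the negated curve *)
NMaximize[-(x^4 - 3 x^2 + x), x]
(* the result has the form {fmax, {x -> xmax}}, mirroring NMinimize *)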

NMinimize[{f, cons}, {x, y, ...}]   try to find the global minimum of f subject to the constraints cons
NMaximize[{f, cons}, {x, y, ...}]   try to find the global maximum of f subject to the constraints cons

Finding global minima and maxima subject to constraints.

With the constraint x > 0, NMinimize will give the local minimum on the right.

In[7]:= NMinimize[{x^4 - 3x^2 + x, x > 0}, x]

Out[7]=

This finds the minimum of x + 2y within the unit circle.

In[8]:= NMinimize[{x + 2y, x^2 + y^2 <= 1}, {x, y}]

Out[8]=

In this case Minimize can give an exact result.

In[9]:= Minimize[{x + 2y, x^2 + y^2 <= 1}, {x, y}]

Out[9]=

But in this case it cannot.

In[10]:= Minimize[{Cos[x + 2y], x^2 + y^2 <= 1}, {x, y}]

Out[10]=

This gives a numerical approximation, effectively using NMinimize.

In[11]:= N[%]

Out[11]=

If both the objective function f and the constraints cons are linear in all variables, then minimization and maximization correspond to a linear programming problem. Sometimes it is convenient to state such problems not in terms of explicit equations, but instead in terms of matrices and vectors.

LinearProgramming[c, m, b]   find the vector x that minimizes c . x subject to the constraints m . x >= b and x >= 0

Linear programming in matrix form.

Here is a linear programming problem in equation form.

In[12]:= Minimize[{2x + 3y, x + 5y >= 10, x - y >= 2, x >= 1}, {x, y}]

Out[12]=

Here is the corresponding problem in matrix form.

In[13]:= LinearProgramming[{2, 3}, {{1, 5}, {1, -1}, {1, 0}},
{10, 2, 1}]

Out[13]=
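
To make the correspondence explicit, here is a sketch (not part of the original session) that names the pieces of the matrix form. Note that LinearProgramming also imposes {x, y} >= 0, which happens not to matter here; the vector it returns is the minimizing point {x, y}, and the minimum value can be recovered with a dot product.

c = {2, 3};                       (* coefficients of the objective 2 x + 3 y *)
m = {{1, 5}, {1, -1}, {1, 0}};    (* one row of coefficients per constraint *)
b = {10, 2, 1};                   (* right-hand sides: the constraints are m . {x, y} >= b *)
sol = LinearProgramming[c, m, b]  (* the minimizing vector {x, y} *)
c . sol                           (* the corresponding minimum value of 2 x + 3 y *)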

You can specify a mixture of equality and inequality constraints by making the list b a sequence of pairs {b_i, s_i}. If s_i is 1, then the i-th constraint is m_i . x >= b_i. If s_i is 0, then it is m_i . x == b_i, and if s_i is -1, then it is m_i . x <= b_i.

This makes the first inequality use <= rather than >=.

In[14]:= LinearProgramming[{2, 3}, {{1, 5}, {1, -1}, {1, 0}},
{{10, -1}, {2, 1}, {1, 1}}]

Out[14]=
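
Similarly, as a sketch (not part of the original session), setting the second element of the first pair to 0 turns the first constraint into the equation x + 5y == 10:

(* first constraint as an equation, the other two as >= inequalities *)
LinearProgramming[{2, 3}, {{1, 5}, {1, -1}, {1, 0}},
  {{10, 0}, {2, 1}, {1, 1}}]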

In LinearProgramming[c, m, b, l], you can make l be a list of pairs {l_1, u_1}, {l_2, u_2}, ... representing lower and upper bounds on the variables x_i.
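
For example, this sketch (not part of the original session) adds the bounds 0 <= x <= 4 and 0 <= y <= 3 to the problem above; since neither bound is active at the optimum, the result is unchanged.

(* lower and upper bounds on x and y, given as pairs {l, u} *)
LinearProgramming[{2, 3}, {{1, 5}, {1, -1}, {1, 0}}, {10, 2, 1},
  {{0, 4}, {0, 3}}]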

In doing large linear programming problems, it is often convenient to give the matrix m as a SparseArray object.
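
As a sketch (not part of the original session), the small constraint matrix above could equally well be given as a SparseArray, specifying only the nonzero entries:

(* the same 3 x 2 constraint matrix as a SparseArray *)
m = SparseArray[{{1, 1} -> 1, {1, 2} -> 5, {2, 1} -> 1, {2, 2} -> -1, {3, 1} -> 1}, {3, 2}];
LinearProgramming[{2, 3}, m, {10, 2, 1}]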