# Numerical Optimization

FindMinimum[f,{x,x0}] | search for a local minimum of f, starting at x = x0 |

FindMinimum[f,x] | search for a local minimum of f |

FindMinimum[f,{{x,x0},{y,y0},...}] | search for a local minimum in several variables |

FindMinimum[{f,cons},{{x,x0},{y,y0},...}] | search for a local minimum subject to the constraints cons, starting at x = x0, y = y0, ... |

FindMinimum[{f,cons},{x,y,...}] | search for a local minimum subject to the constraints cons |

FindMaximum[f,x], etc. | search for a local maximum |

Searching for local minima and maxima.

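The notebook inputs here did not survive extraction. As an illustrative sketch (the objective functions are assumptions, not the original examples):

```wolfram
(* search for a local minimum of x^4 - 3 x^2 + x, starting at x = 2 *)
FindMinimum[x^4 - 3 x^2 + x, {x, 2}]

(* the same kind of search in two variables, starting at x = 1, y = 1 *)
FindMinimum[x^2 + (y - 1)^2 + x y, {{x, 1}, {y, 1}}]
```

In each case FindMinimum returns a list of the form {fmin, {x -> xmin, ...}}, giving the minimum value found and the position at which it occurs.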

Like FindRoot, FindMinimum and FindMaximum work by starting from a point, then progressively searching for a minimum or maximum. But since they return a result as soon as they find anything, they may give only a local minimum or maximum of your function, not a global one.

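The lost examples presumably demonstrated this dependence on the starting point. A sketch with an assumed double-well objective shows how different starting points lead to different local minima:

```wolfram
f = x^4 - 3 x^2 + x;

(* starting at x = 1 converges to the shallow local minimum near x = 1.13 *)
FindMinimum[f, {x, 1}]

(* starting at x = -1 converges to the deeper minimum near x = -1.30 *)
FindMinimum[f, {x, -1}]
```

Only the second search happens to find the global minimum; FindMinimum itself has no way of knowing this.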

NMinimize[f,x] | try to find the global minimum of f |

NMinimize[f,{x,y,...}] | try to find the global minimum over several variables |

NMaximize[f,x] | try to find the global maximum of f |

NMaximize[f,{x,y,...}] | try to find the global maximum over several variables |

Finding global minima and maxima.

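A hedged sketch of the analogous global search, using the same assumed double-well objective as above:

```wolfram
(* NMinimize attempts a global search, so no starting point is given *)
NMinimize[x^4 - 3 x^2 + x, x]
```

Here NMinimize should locate the deeper of the two minima, near x = -1.30, without any guidance about where to start.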

NMinimize and NMaximize are numerical analogs of Minimize and Maximize. But unlike Minimize and Maximize they usually cannot guarantee to find absolute global minima and maxima. Nevertheless, they typically work well when the function f is fairly smooth, and has a limited number of local minima and maxima.

NMinimize[{f,cons},{x,y,...}] | try to find the global minimum of f subject to the constraints cons |

NMaximize[{f,cons},{x,y,...}] | try to find the global maximum of f subject to the constraints cons |

Finding global minima and maxima subject to constraints.

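The original constrained examples were lost; an illustrative pair (the objectives and constraints are assumptions) might look like:

```wolfram
(* minimize x^2 + y^2 on the half-plane x + y >= 2;
   the minimum is 2, attained at x = y = 1 *)
NMinimize[{x^2 + y^2, x + y >= 2}, {x, y}]

(* maximize x + y on the unit disk;
   the maximum is Sqrt[2], at x = y = 1/Sqrt[2] *)
NMaximize[{x + y, x^2 + y^2 <= 1}, {x, y}]
```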

If both the objective function f and the constraints cons are linear in all variables, then minimization and maximization correspond to a *linear programming problem*. Sometimes it is convenient to state such problems not in terms of explicit equations, but instead in terms of matrices and vectors.

LinearProgramming[c,m,b] | find the vector x which minimizes c.x subject to the constraints m.x >= b and x >= 0 |

LinearProgramming[c,m,b,l] | use the constraints m.x >= b and x >= l |

Linear programming in matrix form.

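A minimal matrix-form sketch (the particular vectors and matrix are illustrative, not the lost notebook input):

```wolfram
c = {1, 1};             (* objective: minimize x + y *)
m = {{1, 2}, {3, 1}};   (* constraint matrix *)
b = {3, 4};             (* constraints: x + 2 y >= 3 and 3 x + y >= 4 *)

(* with x >= 0, y >= 0 implied, the minimum is at x = 1, y = 1 *)
LinearProgramming[c, m, b]
```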

You can specify a mixture of equality and inequality constraints by making the list b be a sequence of pairs {b_i, s_i}. If s_i is 1, then the i-th constraint is m_i.x >= b_i. If s_i is 0, then it is m_i.x == b_i, and if s_i is -1, then it is m_i.x <= b_i.

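Continuing the illustrative matrix-form problem above, the pairs might be used like this:

```wolfram
c = {1, 1};
m = {{1, 2}, {3, 1}};

(* {3, 0} makes the first constraint an equality, x + 2 y == 3;
   {4, -1} makes the second an upper bound, 3 x + y <= 4;
   the minimum is then at x = 0, y = 3/2 *)
LinearProgramming[c, m, {{3, 0}, {4, -1}}]
```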

In LinearProgramming[c, m, b, l], you can make l be a list of pairs {{l_1, u_1}, {l_2, u_2}, ...} representing lower and upper bounds on the variables x_i.
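A sketch of bounded variables, with assumed illustrative data:

```wolfram
(* minimize x + 2 y subject to x + y >= 1,
   with bounds 0 <= x <= 3/5 and 0 <= y <= 10;
   the minimum is at x = 3/5, y = 2/5 *)
LinearProgramming[{1, 2}, {{1, 1}}, {1}, {{0, 3/5}, {0, 10}}]
```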

In doing large linear programming problems, it is often convenient to give the matrix m as a SparseArray object.