Convex Optimization

Convex optimization is the problem of minimizing a convex function subject to convex constraints. It is a class of problems for which fast and robust optimization algorithms exist, both in theory and in practice. Following the pattern set by linear optimization, ever wider classes of problems are being identified as convex in a wide variety of domains, such as statistics, finance, signal processing, geometry and many more. Increasingly, the important dividing line in optimization is between convex and nonconvex problems. The Wolfram Language provides the major convex optimization classes, together with their duals and their sensitivity to constraint perturbations. Each class is extensively exemplified, so the documentation can also serve as a learning tool. The general optimization functions automatically recognize and transform a wide variety of problems into these optimization classes. Problem constraints can be modeled compactly using vector variables and vector inequalities.

Convex Optimization Classes

LinearOptimization minimize a linear objective subject to linear constraints

LinearFractionalOptimization minimize a linear-fractional objective subject to linear constraints

QuadraticOptimization minimize a convex quadratic objective subject to linear constraints

SecondOrderConeOptimization minimize a linear objective subject to norm (second-order) cone and linear constraints

SemidefiniteOptimization minimize a linear objective subject to linear matrix inequality (semidefinite) constraints

ConicOptimization minimize a linear objective subject to conic constraints
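
The specialized classes share a common calling pattern: objective, constraints, variables. A minimal sketch with illustrative toy problems (the specific problems are assumptions for illustration, not taken from this guide):

```wolfram
(* linear: minimize 2x + 3y subject to x + y >= 1, x >= 0, y >= 0 *)
LinearOptimization[2 x + 3 y, {x + y >= 1, x >= 0, y >= 0}, {x, y}]
(* → {x -> 1., y -> 0.} *)

(* quadratic: minimize x^2 + y^2 on the line x + y == 1 *)
QuadraticOptimization[x^2 + y^2, {x + y == 1}, {x, y}]
(* → {x -> 0.5, y -> 0.5} *)

(* second-order cone: minimize x + y over the unit disk *)
SecondOrderConeOptimization[x + y, {Norm[{x, y}] <= 1}, {x, y}]
```

Each function returns replacement rules for the variables; companion forms give minimum values and dual results.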

Vector Inequality Constraints

VectorGreaterEqual partial ordering for vectors and matrices

VectorLessEqual  ▪  VectorGreater  ▪  VectorLess
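
VectorGreaterEqual[{v, w}] expresses the elementwise inequality v ⪰ w, so a whole block of scalar constraints can be stated at once; for matrices, orderings such as the semidefinite cone are also supported. A small illustrative sketch:

```wolfram
(* {x, y} ⪰ {1, 2} stands for the two constraints x >= 1 and y >= 2 *)
LinearOptimization[x + y, {VectorGreaterEqual[{{x, y}, {1, 2}}]}, {x, y}]
(* → {x -> 1., y -> 2.} *)
```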

General Convex & Nonconvex Optimization »

FindMinimum numerical local constrained optimization

FindMaximum  ▪  FindMinValue  ▪  FindMaxValue  ▪  FindArgMin  ▪  FindArgMax

NMinimize numerical global constrained optimization

NMaximize  ▪  NMinValue  ▪  NMaxValue  ▪  NArgMin  ▪  NArgMax

Minimize symbolic global constrained optimization

Maximize  ▪  MinValue  ▪  MaxValue  ▪  ArgMin  ▪  ArgMax
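
The general functions accept arbitrary, possibly nonconvex, objectives and constraints, and recognize convex structure automatically when it is present. An illustrative sketch (the objectives are assumptions for illustration):

```wolfram
(* symbolic global: exact minimum value and minimizer *)
Minimize[x^2 - 2 x, x]
(* → {-1, {x -> 1}} *)

(* numerical global, with a box constraint *)
NMinimize[{Sin[x] + x^2/10, -5 <= x <= 5}, x]

(* numerical local search from the starting point x = 2 *)
FindMinimum[Sin[x] + x^2/10, {x, 2}]
```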