LinearOptimization

LinearOptimization[f,cons,vars]

finds values of variables vars that minimize the linear objective f subject to linear constraints cons.

LinearOptimization[c,{a,b}]

finds a vector x that minimizes the linear objective c.x subject to the linear inequality constraints a.x+b ⪰ 0.

LinearOptimization[c,{a,b},{aeq,beq}]

includes the linear equality constraints aeq.x+beq == 0.
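As a sketch of the matrix forms (the problem data below is invented for illustration, using the conventions a.x+b ⪰ 0 and aeq.x+beq == 0), minimize 2 x1 + 3 x2 subject to x1 ≥ 0, x2 ≥ 0, and x1 + x2 ≥ 1:

```wl
(* inequality-only form: rows of a and entries of b encode x1 >= 0, x2 >= 0, x1 + x2 - 1 >= 0 *)
LinearOptimization[{2, 3}, {{{1, 0}, {0, 1}, {1, 1}}, {0, 0, -1}}]

(* adding the equality x1 - x2 == 0 via {aeq, beq} *)
LinearOptimization[{2, 3}, {{{1, 0}, {0, 1}, {1, 1}}, {0, 0, -1}}, {{{1, -1}}, {0}}]
```

Mathematically, the first problem is minimized at the vector {1, 0} (value 2), and the added equality moves the minimizer to {1/2, 1/2} (value 5/2).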

LinearOptimization[…,"prop"]

specifies what solution property "prop" should be returned.
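A sketch of requesting a solution property (the objective and constraints are invented for illustration):

```wl
(* ask only for the minimum value instead of the minimizer *)
LinearOptimization[2 x + 3 y, {x + y >= 1, x >= 0, y >= 0}, {x, y}, "PrimalMinimumValue"]
```

The minimum value here is 2, attained at x = 1, y = 0.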

Details and Options

• Linear optimization is also known as linear programming (LP).
• Linear optimization is a convex optimization problem that can be solved globally and efficiently.
• Linear optimization finds a vector x* that solves the primal problem: »
•  minimize c.x
   subject to the constraints a.x+b ⪰ 0 and aeq.x+beq == 0,
   where x is the vector of variables, c is the objective vector, and {a,b} and {aeq,beq} give the inequality and equality constraint data.
• The constraints cons can be specified by:
•  LessEqual             scalar inequality ≤
   GreaterEqual          scalar inequality ≥
   VectorLessEqual       vector inequality ⪯
   VectorGreaterEqual    vector inequality ⪰
   Equal                 scalar or vector equality
   Element               convex domain or region element
• With LinearOptimization[f,cons,vars], parameter equations of the form par==val, where par is not in vars and val is numerical or an array with numerical values, may be included in the constraints to define parameters used in f or cons.
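For instance (a sketch; the parameter name c and the data are invented), a numeric parameter can be supplied through an equation in the constraints instead of being substituted into the objective:

```wl
(* c is defined by the parameter equation c == {2, 3}; it is not an optimization variable *)
LinearOptimization[c . {x, y}, {x + y >= 1, x >= 0, y >= 0, c == {2, 3}}, {x, y}]
```

This is equivalent to minimizing 2 x + 3 y directly; the minimizer is x = 1, y = 0.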
• The primal minimization problem has a related maximization problem that is the Lagrangian dual problem. The dual maximum value is always less than or equal to the primal minimum value, so it provides a lower bound. The dual maximizer provides information about the primal problem, including sensitivity of the minimum value to changes in the constraints.
• The Lagrangian dual problem for linear optimization is given by: »
•  maximize -b.λ - beq.ν
   subject to the constraints Transpose[a].λ + Transpose[aeq].ν == c and λ ⪰ 0,
   where λ and ν are the dual variables associated with the inequality and equality constraints, respectively.
• For linear optimization, strong duality always holds, meaning that if there is a solution to the primal minimization problem, then there is a solution to the dual maximization problem, and the dual maximum value is equal to the primal minimum value.
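Strong duality can be checked directly by requesting both optimal values (the problem data here is invented for illustration):

```wl
(* for a feasible, bounded LP the primal minimum and dual maximum coincide *)
LinearOptimization[2 x + 3 y, {x + y >= 1, x >= 0, y >= 0}, {x, y},
 {"PrimalMinimumValue", "DualMaximumValue"}]
```

Both values are 2, so the duality gap is 0.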
• The possible solution properties "prop" include:
•  "PrimalMinimizer"              a list of variable values that minimizes the objective function
   "PrimalMinimizerRules"         values for the variables vars={v1,…} that minimize the objective
   "PrimalMinimizerVector"        the vector x* that minimizes c.x
   "PrimalMinimumValue"           the minimum value c.x*
   "DualMaximizer"                the vector that maximizes the dual problem
   "DualMaximumValue"             the dual maximum value
   "DualityGap"                   the difference between the dual and primal optimal values (0 because of strong duality)
   "Slack"                        the constraint slack vector
   "ConstraintSensitivity"        sensitivity of the minimum value to constraint perturbations
   "ObjectiveVector"              the linear objective vector c
   "LinearInequalityConstraints"  the linear inequality constraint matrix and vector {a,b}
   "LinearEqualityConstraints"    the linear equality constraint matrix and vector {aeq,beq}
   {"prop1","prop2",…}            several solution properties
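Several properties can be requested at once. For example (data invented for this sketch), the matrix form that LinearOptimization derives from symbolic input can be inspected:

```wl
(* recover the objective vector c and the inequality constraint data {a, b} *)
LinearOptimization[2 x + 3 y, {x + y >= 1, x >= 0, y >= 0}, {x, y},
 {"ObjectiveVector", "LinearInequalityConstraints"}]
```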
• The following options can be given:
•  MaxIterations        Automatic          maximum number of iterations to use
   Method               Automatic          the method to use
   PerformanceGoal      $PerformanceGoal   aspects of performance to try to optimize
   Tolerance            Automatic          the tolerance to use for internal comparisons
   WorkingPrecision     Automatic          precision to use in internal computations
• The option Method->method may be used to specify the method to use. Available methods include:
•  Automatic            choose the method automatically
   "Simplex"            simplex method
   "RevisedSimplex"     revised simplex method
   "InteriorPoint"      interior point method (machine precision only)
   "CLP"                COIN library linear programming (machine precision only)
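A sketch of selecting a method explicitly (the problem data is invented; "InteriorPoint" works at machine precision only, so the result is numerical):

```wl
LinearOptimization[2 x + 3 y, {x + y >= 1, x >= 0, y >= 0}, {x, y},
 Method -> "InteriorPoint"]
```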
• With WorkingPrecision->Automatic, the precision is taken automatically from the precision of the input arguments, unless a method is specified that only works with machine precision, in which case machine precision is used.

Examples


Basic Examples (2)

Minimize a linear objective subject to linear inequality constraints:
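A representative input of this form (the objective and constraints are invented for this sketch):

```wl
(* minimize -x - y, i.e. maximize x + y, over a bounded polygon *)
LinearOptimization[-x - y, {x + 2 y <= 4, x <= 3, x >= 0, y >= 0}, {x, y}]
```

The result is a list of rules for the variables; here the minimizer is x = 3, y = 1/2, with objective value -7/2.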

The optimal point lies in the region defined by the constraints, at the position where the objective is smallest within the region:

Minimize a linear objective subject to constraints specified in symbolic form:

Use the equivalent matrix representation:
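As a sketch of the symbolic-to-matrix translation (data invented for illustration), minimizing -x - y subject to x + 2y ≤ 4, x ≤ 3, x ≥ 0, y ≥ 0 can be rewritten in the a.x+b ⪰ 0 convention:

```wl
(* rows of a and entries of b encode: -x - 2 y + 4 >= 0, -x + 3 >= 0, x >= 0, y >= 0 *)
LinearOptimization[{-1, -1}, {{{-1, -2}, {-1, 0}, {1, 0}, {0, 1}}, {4, 3, 0, 0}}]
```

The matrix form returns the minimizer as a vector rather than as rules; here it is {3, 1/2}.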

Possible Issues (3)

Introduced in 2019 (12.0)