SciPy is an open-source Python library for scientific computation. The optimize package in SciPy provides several common optimization routines, such as least squares, minimization, and curve fitting.
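As a quick illustration of the minimization routines the package offers, the following sketch minimizes a simple two-variable quadratic with scipy.optimize.minimize; the objective function and starting point are made up for illustration:

import numpy as np
from scipy.optimize import minimize

# A simple quadratic objective: f(x) = (x1 - 3)^2 + (x2 + 1)^2
def objective(x):
    return (x[0] - 3) ** 2 + (x[1] + 1) ** 2

# Minimize the objective starting from an arbitrary initial guess
result = minimize(objective, x0=np.array([0.0, 0.0]))
print(result.x)  # approximately [3, -1]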
The optimize.linprog() function
The optimize.linprog() function is used to solve linear programming problems. Its prototype is as follows:

scipy.optimize.linprog(c, A_ub=None, b_ub=None, A_eq=None, b_eq=None, bounds=None, method='interior-point', callback=None, options=None, x0=None)
The function optimize.linprog() accepts the following parameters (a short usage sketch follows the list):
c: This is a one-dimensional array representing the coefficients of the linear objective function.
A_ub: This is a two-dimensional array representing the inequality constraint matrix. Each row of the matrix represents the coefficients of a linear inequality.
b_ub: This is a one-dimensional array representing the inequality constraint vector.
A_eq: This is a two-dimensional array representing the equality constraint matrix. Each row of the matrix represents the coefficients of a linear equality.
b_eq: This is a one-dimensional array representing the equality constraint vector.
bounds: This is a sequence of (min, max) pairs defining the minimum and maximum values of the decision variables.
callback: This is an optional function argument that is invoked on every iteration of the algorithm.
method: This is the algorithm used to solve the standard form problem.
options: This is a dictionary of solver options.
x0: This is a one-dimensional array that represents the guess values of the decision variables.
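The following is a minimal sketch of how these parameters fit together; the objective coefficients, constraints, and bounds below are made up purely for illustration:

import numpy as np
from scipy.optimize import linprog

# Minimize 2*x1 + 3*x2 subject to:
#   x1 + x2 <= 10   (inequality constraint)
#   x1 - x2 == 2    (equality constraint)
#   x1 >= 0, 0 <= x2 <= 5
c = np.array([2, 3])
A_ub = np.array([[1, 1]])
b_ub = np.array([10])
A_eq = np.array([[1, -1]])
b_eq = np.array([2])
bounds = [(0, None), (0, 5)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
print(res.x, res.fun)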
This function returns the solution as an OptimizeResult object. The following components are part of the returned object (a short sketch of reading them follows the list):
x: This contains the values of the decision variables that minimize the objective function while satisfying the defined constraints.
fun: This is the optimal value of the objective function.
slack: These are the (nominally positive) values of the slack variables, that is, the differences between the right-hand and left-hand sides of the inequality constraints.
con: These are the residuals of the equality constraints.
status: This is an integer between 0 and 4 that represents the exit status of the algorithm.
success: This is a Boolean value that is True when the algorithm has found an optimal solution.
nit: This is the total number of iterations performed in all phases.
message: This is the message produced on the algorithm's termination.
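The sketch below shows how these components can be read from the returned object; the tiny problem it solves (minimize x1 + 2*x2 with x1 + x2 >= 4 and non-negative variables) is made up only so that there is a result to inspect:

import numpy as np
from scipy.optimize import linprog

# A tiny problem, only to obtain an OptimizeResult to inspect
res = linprog(c=np.array([1, 2]),
              A_ub=np.array([[-1, -1]]),   # -x1 - x2 <= -4, i.e., x1 + x2 >= 4
              b_ub=np.array([-4]),
              bounds=[(0, None), (0, None)])

print('x:      ', res.x)        # optimal decision variables
print('fun:    ', res.fun)      # optimal objective value
print('slack:  ', res.slack)    # slack in the inequality constraints
print('con:    ', res.con)      # equality constraint residuals (empty here)
print('status: ', res.status)   # integer exit code, 0 means success
print('success:', res.success)  # True if an optimal solution was found
print('nit:    ', res.nit)      # total number of iterations
print('message:', res.message)  # termination message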
Let's see how we can use the optimize.linprog() function to solve the following minimization problem:

minimize: 10*x1 + 15*x2 + 25*x3

such that:
x1 + x2 + x3 >= 1000
x1 - 2*x2 >= 0
x3 >= 340
x1 >= 0, x2 >= 0, x3 >= 0
import numpy as np
from scipy.optimize import linprog

# Setting the inequality constraints matrix
A = np.array([[-1, -1, -1],
              [-1, 2, 0],
              [0, 0, -1],
              [-1, 0, 0],
              [0, -1, 0],
              [0, 0, -1]])

# Setting the inequality constraints vector
b = np.array([-1000, 0, -340, 0, 0, 0])

# Setting the coefficients of the linear objective function vector
c = np.array([10, 15, 25])

# Solving linear programming problem
res = linprog(c, A_ub=A, b_ub=b)

print('Optimal value:', round(res.fun, ndigits=2),
      '\nx values:', res.x,
      '\nNumber of iterations performed:', res.nit,
      '\nStatus:', res.message)
Lines 5 to 10: We set the inequality constraint matrix as A. Since the function requires the constraints to be in the <= form, we multiply the >= inequalities by -1.
Line 13: We initialize the inequality constraints vector as b.
Line 16: We set the coefficients of the objective function as c.
Lines 19 to 24: Finally, we store the result of the linprog() function in res and print the various components of the res object. A variant of this example that uses the bounds parameter instead of extra constraint rows is sketched below.
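The three non-negativity constraints in A can also be expressed through the bounds parameter, which is in fact the default behavior of linprog (each variable is bounded to (0, None) unless told otherwise). The following is a sketch of that variant, solving the same problem with a smaller constraint matrix; passing bounds explicitly here just makes the assumption visible:

import numpy as np
from scipy.optimize import linprog

# Same problem, but x1, x2, x3 >= 0 is expressed via bounds
# instead of extra rows in the inequality constraint matrix.
A = np.array([[-1, -1, -1],   # x1 + x2 + x3 >= 1000
              [-1, 2, 0],     # x1 - 2*x2    >= 0
              [0, 0, -1]])    # x3           >= 340
b = np.array([-1000, 0, -340])
c = np.array([10, 15, 25])

res = linprog(c, A_ub=A, b_ub=b, bounds=[(0, None)] * 3)
print('Optimal value:', round(res.fun, ndigits=2), '\nx values:', res.x)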