Convex Optimization

Discover the power of convex optimization, including its standard form, important properties, and code examples.

What is convex optimization?

Convex optimization is a mathematical optimization technique for solving problems in which the objective function and the constraints are convex.

Standard form

The standard form of a convex optimization problem is as follows:

\begin{aligned} \min_{\bold x} \quad & f_0(\bold x)\\ \textrm{s.t.} \quad & f_i(\bold x)\le0 \quad & i=1,2,\dots,m\\ \quad & g_j(\bold x)=0 \quad & j=1,2,\dots,k \\ \end{aligned}

Here, the objective function $f_0$ and the inequality constraint functions $f_i$ are all convex, and the equality constraint functions $g_j$ are linear.
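As a minimal sketch of solving a problem in this standard form, the snippet below minimizes a one-dimensional convex objective under a single inequality constraint using projected gradient descent. The function name, step size, and iteration count are illustrative choices, not part of any library API:

```python
# Solve: minimize f0(x) = (x - 3)^2  subject to  f1(x) = x - 2 <= 0.
# The unconstrained minimum (x = 3) is infeasible, so the optimum sits
# on the constraint boundary at x = 2.
def solve_projected_gradient(step=0.1, iters=200):
    x = 0.0                      # feasible starting point
    for _ in range(iters):
        grad = 2 * (x - 3)       # gradient of the convex objective
        x = x - step * grad      # gradient descent step
        x = min(x, 2.0)          # project back onto the feasible set {x : x <= 2}
    return x

print(solve_projected_gradient())  # converges to 2.0, the constrained optimum
```

Because both the objective and the feasible set are convex, this simple iteration cannot get trapped in a spurious local minimum; dedicated solvers handle the general multi-constraint case.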

What is a convex function?

A convex function is a real-valued function whose graph lies on or below the line segment connecting any two points on the graph. This means that the function curves upward and doesn't have any dips or humps that would lift it above that segment. This property ensures that every local minimum of the function is a global minimum, making convex functions especially useful in optimization problems. They are widely used in various fields, including mathematics, economics, physics, and engineering, due to their simplicity and tractability in modeling real-world phenomena. In machine learning, convex functions are commonly used as objective functions in optimization problems, where the goal is to find the values of parameters that minimize or maximize the function.
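The line-segment definition can be checked numerically: for a convex $f$, $f(ta + (1-t)b) \le t\,f(a) + (1-t)\,f(b)$ for all points $a, b$ and all $t \in [0, 1]$. The helper below is an illustrative sampling-based check (its name and tolerance are assumptions, not a standard API), so it can only refute convexity on the sampled points, not prove it in general:

```python
# Numerically test the convexity inequality on a grid of sample points.
def is_convex_on_samples(f, points, steps=11):
    for a in points:
        for b in points:
            for i in range(steps):
                t = i / (steps - 1)
                mid = t * a + (1 - t) * b
                # Convexity: f at the blended point must not exceed the chord.
                if f(mid) > t * f(a) + (1 - t) * f(b) + 1e-12:
                    return False
    return True

print(is_convex_on_samples(lambda x: x * x, [-2, -1, 0, 1, 2]))   # True
print(is_convex_on_samples(lambda x: x ** 3, [-2, -1, 0, 1, 2]))  # False
```

Here $x^2$ passes because its graph stays below every chord, while $x^3$ fails: on the interval $[-2, 0]$ it curves downward, so its graph rises above the chord.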
