Dual Formulation
Learn the dual formulation and derive the Lagrangian dual function for the hard- and soft-margin SVM.
Why dual formulation?
The dual formulation of SVM is important because it allows us to kernelize the optimization problem, making it easier to find a hyperplane that separates the classes in a higher-dimensional feature space. Additionally, the dual formulation makes the sparsity of the solution visible: only the support vectors receive non-zero multipliers, which helps explain SVM's strong generalization ability in high-dimensional feature spaces.
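The kernelization point can be made concrete: the dual objective (derived below) touches the data only through inner products $x_i^\top x_j$, collected in a Gram matrix, so any valid kernel can be substituted for the plain inner product. A minimal sketch with toy data (the points and the RBF bandwidth `gamma` are illustrative choices, not from the text):

```python
import numpy as np

# Toy data: the dual objective uses the data only through the inner
# products x_i . x_j, collected in the Gram matrix K.
X = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, -1.0]])

# Linear kernel: K[i, j] = x_i . x_j
K_linear = X @ X.T

# Kernel trick: swap the inner product for any kernel, e.g. an RBF
# kernel, to separate the data in an implicit feature space.
def rbf_kernel(X, gamma=1.0):
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T  # squared pairwise distances
    return np.exp(-gamma * d2)

K_rbf = rbf_kernel(X)
print(K_linear[0, 1])  # 0.0 -> these two points are orthogonal
print(K_rbf[0, 0])     # 1.0 -> a point always matches itself
```

Nothing else in the dual changes; the optimization is identical with `K_rbf` in place of `K_linear`.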
Lagrangian dual of hard-margin SVM
In the case of hard-margin SVM, we have a constrained optimization problem where we want to find the parameters that minimize the objective function subject to the constraint that all training samples are classified correctly with a margin of at least 1. This can be written as:

$$\min_{w,\, b} \; \frac{1}{2}\lVert w \rVert^2 \quad \text{subject to} \quad y_i\,(w^\top x_i + b) \ge 1, \quad i = 1, \dots, n$$
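The constraint can be checked numerically. A short sketch with a hypothetical separable dataset and a candidate hyperplane $(w, b)$ of my own choosing:

```python
import numpy as np

# Hypothetical separable toy data: labels y_i in {-1, +1}.
X = np.array([[2.0, 0.0], [3.0, 1.0], [-2.0, 0.0], [-3.0, -1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])

# Candidate hyperplane parameters (illustrative, not optimal).
w = np.array([1.0, 0.0])
b = 0.0

# Hard-margin feasibility: y_i (w . x_i + b) >= 1 for every sample i.
margins = y * (X @ w + b)
print(margins)               # [2. 3. 2. 3.]
print(np.all(margins >= 1))  # True -> (w, b) is feasible
```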
We introduce a Lagrange multiplier for each constraint to apply the Lagrangian method. We call it $\alpha_i \ge 0$ and write the Lagrangian as:

$$L(w, b, \alpha) = \frac{1}{2}\lVert w \rVert^2 - \sum_{i=1}^{n} \alpha_i \left[ y_i\,(w^\top x_i + b) - 1 \right]$$
Here, the Lagrange multipliers $\alpha_i$ enforce the constraints by penalizing the objective function for any violations. If a constraint is violated, the corresponding multiplier $\alpha_i$ will be non-zero, which increases the value of the Lagrangian.
We can then form the Lagrangian dual function by minimizing the Lagrangian with respect to the primal variables $(w, b)$ while maximizing it with respect to the Lagrange multipliers $\alpha_i$:

$$g(\alpha) = \min_{w,\, b}\; L(w, b, \alpha)$$

Setting the derivatives of $L$ with respect to $w$ and $b$ to zero yields $w = \sum_{i} \alpha_i y_i x_i$ and $\sum_{i} \alpha_i y_i = 0$. Substituting these back into $L$ gives the dual problem:

$$\max_{\alpha} \; \sum_{i=1}^{n} \alpha_i - \frac{1}{2} \sum_{i=1}^{n} \sum_{j=1}^{n} \alpha_i \alpha_j y_i y_j \, x_i^\top x_j \quad \text{subject to} \quad \alpha_i \ge 0, \;\; \sum_{i=1}^{n} \alpha_i y_i = 0$$
Here, $g(\alpha)$ defines the new optimization problem, in which we only need to maximize over the Lagrange multipliers $\alpha_i$. The solution to this problem gives us the optimal set of Lagrange multipliers, which can be used to derive the optimal parameters $(w, b)$ for the original problem.
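The whole pipeline can be traced on a minimal two-point problem (a hypothetical example I construct here, where the equality constraint forces $\alpha_1 = \alpha_2$ and the dual reduces to a one-dimensional quadratic with a closed-form maximizer):

```python
import numpy as np

# Hypothetical two-point problem: x1 = (1, 0), y1 = +1; x2 = (-1, 0), y2 = -1.
X = np.array([[1.0, 0.0], [-1.0, 0.0]])
y = np.array([1.0, -1.0])

# The equality constraint sum_i alpha_i y_i = 0 forces alpha1 = alpha2 = a,
# so the dual objective collapses to a concave quadratic in a:
#   g(a) = 2a - (1/2) a^2 * sum_{i,j} y_i y_j (x_i . x_j)
Q = (y[:, None] * y[None, :]) * (X @ X.T)  # entries y_i y_j x_i . x_j
c2 = Q.sum()                               # = 4 for this data
a_star = 2.0 / c2                          # maximizer of 2a - (c2/2) a^2

alpha = np.array([a_star, a_star])

# Recover the primal solution: w = sum_i alpha_i y_i x_i, and b from
# an active constraint y_i (w . x_i + b) = 1 (so b = y_i - w . x_i).
w = (alpha * y) @ X
b = y[0] - w @ X[0]
print(alpha)  # [0.5 0.5]
print(w, b)   # [1. 0.] 0.0
```

Both points end up as support vectors with non-zero multipliers, and the recovered hyperplane $x_1 = 0$ sits midway between them with margin $2/\lVert w \rVert = 2$.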
Explanation of the penalty term $-\alpha_i \left[ y_i\,(w^\top x_i + b) - 1 \right]$
Since $L$ must be maximized with respect to $\alpha_i \ge 0$, we can interpret the term $-\alpha_i \left[ y_i\,(w^\top x_i + b) - 1 \right]$ as a penalty for violating the constraint. Whenever the constraint is violated, $y_i\,(w^\top x_i + b) - 1$ becomes negative, and maximizing over $\alpha_i$ adds an arbitrarily large positive number to $L$. Simultaneously, the minimization over $(w, b)$ keeps $L$ bounded by ensuring that the constraints are not violated.
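This penalty behavior is easy to observe numerically. A sketch with toy numbers of my own (an infeasible $w$ whose margins fall below 1):

```python
import numpy as np

# Toy demonstration: the term -alpha_i [y_i (w . x_i + b) - 1] acts as a
# penalty once we maximize the Lagrangian over alpha_i >= 0.
def lagrangian(w, b, alpha, X, y):
    slack = y * (X @ w + b) - 1.0          # >= 0 iff the constraints hold
    return 0.5 * w @ w - np.sum(alpha * slack)

X = np.array([[1.0, 0.0], [-1.0, 0.0]])
y = np.array([1.0, -1.0])
w_bad = np.array([0.2, 0.0])  # too small: both margins fall below 1

slack = y * (X @ w_bad) - 1.0
print(slack)  # [-0.8 -0.8] -> both constraints violated

# Raising alpha on a violated constraint drives L upward without bound, so
# the inner minimization over (w, b) is forced to satisfy the constraints.
for a in [1.0, 10.0, 100.0]:
    print(lagrangian(w_bad, 0.0, np.array([a, a]), X, y))
```

With the violated constraints, $L$ grows linearly in $\alpha$, which is exactly why the max over $\alpha$ rules out infeasible $(w, b)$.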
Lagrangian dual of soft-margin SVM
In the case of soft-margin SVM, our optimization problem and constraints are as follows:

$$\min_{w,\, b,\, \xi} \; \frac{1}{2}\lVert w \rVert^2 + C \sum_{i=1}^{n} \xi_i \quad \text{subject to} \quad y_i\,(w^\top x_i + b) \ge 1 - \xi_i, \;\; \xi_i \ge 0, \quad i = 1, \dots, n$$