Dual Formulation
Explore the dual formulation of support vector machines, and see how Lagrange multipliers convert the constrained primal problem into its dual. Learn why this approach enables kernel methods, reveals sparse solutions, and supports efficient prediction in high-dimensional spaces.
Support vector machines become far more powerful when we move beyond linear boundaries, and kernels make this possible without ever computing features explicitly. Kernel SVMs map data into high-dimensional spaces where complex patterns become separable, all while keeping computations efficient through the Gram matrix.
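As a concrete illustration of that last point, here is a minimal NumPy sketch of computing a Gram matrix with an RBF kernel; the function name `rbf_gram` and the `gamma` default are our own illustrative choices, not taken from any particular library.

```python
import numpy as np

def rbf_gram(X, gamma=1.0):
    """Compute the n x n RBF Gram matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2)."""
    # Squared Euclidean distances via ||a - b||^2 = ||a||^2 + ||b||^2 - 2 a.b
    sq_norms = np.sum(X ** 2, axis=1)
    sq_dists = sq_norms[:, None] + sq_norms[None, :] - 2.0 * (X @ X.T)
    return np.exp(-gamma * np.maximum(sq_dists, 0.0))  # clip tiny negatives from round-off

X = np.random.RandomState(0).randn(5, 3)  # 5 points in 3 dimensions
K = rbf_gram(X, gamma=0.5)
print(K.shape)  # (5, 5): all pairwise similarities, no explicit high-dimensional features
```

The key observation is that `K` captures everything the SVM needs about the data, so the high-dimensional feature map is never materialized.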
In this lesson, we will explore how kernels work in the dual formulation and implement them using cvxpy and sklearn-style functions.
We will also examine sparsity, an important property of SVM solutions: only a small subset of the training points (the support vectors) determines the decision boundary, which explains why SVMs generalize well even in very high-dimensional spaces.
Kernels in SVM
The dual formulation makes kernelizing the SVM straightforward. As the following dual optimization problem shows, the Gram matrix can be computed using any kernel function:
$$
\begin{aligned}
\max_{\alpha}\quad & \sum_{i=1}^{n} \alpha_i - \frac{1}{2}\sum_{i=1}^{n}\sum_{j=1}^{n} \alpha_i \alpha_j\, y_i y_j\, K(x_i, x_j) \\
\text{subject to}\quad & 0 \le \alpha_i \le C, \quad i = 1, \dots, n, \\
& \sum_{i=1}^{n} \alpha_i y_i = 0,
\end{aligned}
$$

where the $n \times n$ Gram matrix has entries $K_{ij} = K(x_i, x_j)$, so the data enter the problem only through pairwise kernel evaluations.
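Since the lesson uses cvxpy, here is a hedged sketch of solving this dual for a precomputed Gram matrix; the helper name `fit_dual_svm`, the small ridge term, and the support-vector threshold are illustrative choices under our own assumptions, not a definitive implementation.

```python
import numpy as np
import cvxpy as cp

def fit_dual_svm(K, y, C=1.0):
    """Solve the kernel SVM dual for a precomputed Gram matrix K and labels y in {-1, +1}."""
    n = len(y)
    Q = np.outer(y, y) * K
    Q = Q + 1e-8 * np.eye(n)  # tiny ridge so quad_form accepts Q as PSD despite round-off
    alpha = cp.Variable(n)
    objective = cp.Maximize(cp.sum(alpha) - 0.5 * cp.quad_form(alpha, Q))
    constraints = [alpha >= 0, alpha <= C, y @ alpha == 0]
    cp.Problem(objective, constraints).solve()
    return alpha.value

# Toy usage with a linear kernel, K = X X^T (any kernel's Gram matrix plugs in the same way).
rng = np.random.RandomState(0)
X = rng.randn(40, 2)
y = np.sign(X[:, 0] + 0.1 * rng.randn(40))
alpha = fit_dual_svm(X @ X.T, y, C=1.0)
print("support vectors:", np.sum(alpha > 1e-5), "of", len(y))
```

On typical data, most entries of `alpha` land at zero, which is exactly the sparsity property discussed above: only the support vectors carry nonzero multipliers and contribute to predictions.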