Hard-Margin SVM
Learn how to implement and optimize the hard-margin SVM.
In this lesson, we explore the hard-margin SVM, a type of support vector machine designed to perfectly separate two classes with a maximum-margin hyperplane. The hard-margin SVM is suited to datasets that are linearly separable and free of outliers. We’ll start by understanding the concept of a linearly separable dataset and the idea of maximizing the margin. Then, we’ll carefully derive the equivalent optimization problem, discuss the critical role of support vectors, and implement the hard-margin SVM using cvxpy to visualize the decision boundary and the support vectors.
Linearly separable case
Given a binary classification dataset $\{(\mathbf{x}_i, y_i)\}_{i=1}^{n}$, where $\mathbf{x}_i \in \mathbb{R}^d$ and $y_i \in \{-1, +1\}$, if a hyperplane exists that separates the two classes, the dataset is said to be linearly separable in the feature space defined by the mapping $\phi$. We’ll assume for now that the dataset is linearly separable. The goal is to find the hyperplane with a maximum margin by optimizing the following objective:

$$\max_{\mathbf{w},\, b} \; \min_{i} \; \frac{y_i\left(\mathbf{w}^\top \phi(\mathbf{x}_i) + b\right)}{\lVert \mathbf{w} \rVert}$$
The direct optimization of the above objective is complex. Here is the derivation of the equivalent, simplified optimization problem:
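One standard way to carry out this simplification (a sketch, which may differ in presentation from the steps below) relies on scale invariance: for any $\kappa > 0$, replacing $(\mathbf{w}, b)$ with $(\kappa \mathbf{w}, \kappa b)$ leaves both the hyperplane and the margin unchanged. We may therefore fix the scale by requiring

$$\min_{i} \; y_i\left(\mathbf{w}^\top \phi(\mathbf{x}_i) + b\right) = 1,$$

so that the margin becomes $1 / \lVert \mathbf{w} \rVert$, and maximizing it is equivalent to

$$\min_{\mathbf{w},\, b} \; \frac{1}{2}\lVert \mathbf{w} \rVert^2 \quad \text{s.t.} \quad y_i\left(\mathbf{w}^\top \phi(\mathbf{x}_i) + b\right) \ge 1, \quad i = 1, \dots, n,$$

a convex quadratic program with linear constraints.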
...