
Hard-Margin SVM

Explore hard-margin support vector machines to understand how to perfectly separate two classes with a maximum-margin hyperplane. Learn the mathematical derivation of the optimization problem, the role of support vectors, and how to implement the model using Python's cvxpy library. Gain insight into visualizing decision boundaries and prepare to advance toward soft-margin and kernelized SVMs.

In this lesson, we explore Hard-margin SVM, a type of support vector machine designed to perfectly separate two classes with a maximum margin hyperplane. Hard-margin SVM is ideal for datasets that are linearly separable and free of outliers. We’ll start by understanding the concept of a linearly separable dataset and the idea of maximizing the margin. Then, we’ll carefully derive the equivalent optimization problem, discuss the critical role of support vectors, and implement hard-margin SVM using cvxpy to visualize the decision boundary and the support vectors.

Linearly separable case

Given a binary classification dataset $D=\{(\bold x_1, y_1), (\bold x_2, y_2), \dots, (\bold x_n, y_n)\}$, where $\bold x_i \in \R^d$ and $y_i \in \{-1, 1\}$ ...