
Hard-Margin SVM

Learn to identify and implement hard-margin SVM models that find the maximum margin hyperplane for perfectly separable datasets. This lesson covers the mathematical formulation, the role of support vectors, convex optimization using cvxpy, and visualization of decision boundaries, preparing you for advanced SVM concepts.

In this lesson, we explore hard-margin SVM, a type of support vector machine designed to perfectly separate two classes with a maximum-margin hyperplane. Hard-margin SVM is suited to datasets that are linearly separable and free of outliers. We’ll start by understanding the concept of a linearly separable dataset and the idea of maximizing the margin. Then, we’ll carefully derive the equivalent optimization problem, discuss the critical role of support vectors, and implement hard-margin SVM using cvxpy to visualize the decision boundary and the support vectors.

Linearly separable case

Given a binary classification dataset $D=\{(\bold x_1, y_1), (\bold x_2, y_2), \dots, (\bold x_n, y_n)\}$, where $\bold x_i \in \R^d$ and $y_i \in \{-1, 1\}$ ...
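As a concrete illustration of this setup, the following minimal sketch (not part of the lesson; all names and cluster centers are chosen for illustration) constructs a toy 2-D dataset of this form, with labels $y_i \in \{-1, 1\}$, that is linearly separable by inspection:

```python
import numpy as np

# A minimal sketch: build a toy linearly separable 2-D dataset
# matching the definition above (x_i in R^d, y_i in {-1, +1}).
rng = np.random.default_rng(0)

n_per_class = 20
# Class +1 clustered around (2, 2); class -1 clustered around (-2, -2).
X_pos = rng.normal(loc=2.0, scale=0.5, size=(n_per_class, 2))
X_neg = rng.normal(loc=-2.0, scale=0.5, size=(n_per_class, 2))

X = np.vstack([X_pos, X_neg])  # shape (n, d) with d = 2
y = np.concatenate([np.ones(n_per_class), -np.ones(n_per_class)])

# The hyperplane w.x + b = 0 with w = (1, 1), b = 0 puts every point of
# class +1 on one side and every point of class -1 on the other, so the
# dataset is linearly separable: y_i * (w.x_i + b) > 0 for all i.
w, b = np.array([1.0, 1.0]), 0.0
print(np.all(y * (X @ w + b) > 0))
```

Checking the sign condition $y_i(\bold w \cdot \bold x_i + b) > 0$ for all $i$ is exactly the separability criterion the hard-margin formulation builds on.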