
Decision Boundary

Explore how decision boundaries work in logistic regression models by analyzing the linear separation of classes using PyTorch. Understand the role of model weights in defining the boundary, and see how increasing feature dimensions can improve class separability through practical examples and visualizations.

Decision boundary for logistic regression

We have just figured out that whenever z equals zero, we are on the decision boundary. But z is given by a linear combination of the features x1 and x2. If we work through some basic algebra, we arrive at:

$$z = 0 = b + w_1x_1 + w_2x_2$$

$$-w_2x_2 = b + w_1x_1$$

...
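
To make the relationship concrete, below is a minimal PyTorch sketch that computes the boundary line implied by solving z = 0 for x2, i.e. x2 = -(b + w1·x1) / w2. The weight vector and bias values used here are illustrative placeholders, not weights taken from the lesson's model.

```python
import torch

# Hypothetical parameters of a trained logistic regression over two features.
# These values are illustrative only.
w = torch.tensor([1.5, -2.0])  # w = [w1, w2]
b = torch.tensor(0.3)          # bias

# The decision boundary is the set of points where z = b + w1*x1 + w2*x2 = 0.
# Solving for x2 gives a straight line in feature space: x2 = -(b + w1*x1) / w2.
x1 = torch.linspace(-3, 3, steps=100)
x2_boundary = -(b + w[0] * x1) / w[1]

# Sanity check: every point on the computed line should give z = 0,
# which under the sigmoid corresponds to a predicted probability of 0.5.
z = b + w[0] * x1 + w[1] * x2_boundary
print(torch.allclose(z, torch.zeros_like(z), atol=1e-6))  # True
```

Points with z greater than zero fall on one side of this line (the positive class), and points with z less than zero fall on the other, which is why the boundary of a two-feature logistic regression is always a straight line.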