Gradient Descent: The Stochastic Update
Learn to code the perceptron trick using Stochastic Gradient Descent.
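Before looking at the data, here is a minimal sketch of what the stochastic update of the perceptron trick looks like: the weights and bias are adjusted using one data point at a time. The function name `perceptron_step` and the learning rate value are illustrative assumptions, not the lesson's own code.

```python
import numpy as np

def step(z):
    """Step activation: 1 if z >= 0, else 0."""
    return 1 if z >= 0 else 0

def perceptron_step(weights, bias, x, y, lr=0.01):
    """One stochastic update: adjust the boundary using a single point (x, y)."""
    y_hat = step(np.dot(weights, x) + bias)
    # If the prediction is wrong, nudge the boundary toward/away from x;
    # if it is correct, (y - y_hat) is 0 and nothing changes.
    weights = weights + lr * (y - y_hat) * x
    bias = bias + lr * (y - y_hat)
    return weights, bias
```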
Exploratory data analysis
We have two features, X1 and X2, and a label. Each feature has ten data points, and the label is either 0 or 1. Our goal is to make a decision boundary that separates the two classes.
Let's look at the data.
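The lesson's actual values are not reproduced here; the snippet below builds a hypothetical stand-in dataset with the same shape (ten points, two features `X1` and `X2`, and a binary label) so the later steps can be run.

```python
import numpy as np

# Hypothetical stand-in for the lesson's dataset: ten points,
# two features (X1, X2), and a binary label (0 or 1).
X1 = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0, 4.5, 5.0])
X2 = np.array([1.0, 0.8, 1.2, 0.9, 1.1, 3.0, 3.2, 2.8, 3.5, 3.1])
label = np.array([0, 0, 0, 0, 0, 1, 1, 1, 1, 1])
```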
Note: Remember, we can only apply the perceptron algorithm if the data is linearly separable. To check this, we need to draw the points on a graph and visualize the data.
Draw the data points on the graph to check whether the two classes are linearly separable.
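A minimal sketch of that plot with matplotlib, assuming the stand-in arrays `X1`, `X2`, and `label` defined above:

```python
import matplotlib.pyplot as plt

# Color each point by its class so linear separability is easy to judge.
plt.scatter(X1[label == 0], X2[label == 0], color='red', label='Class 0')
plt.scatter(X1[label == 1], X2[label == 1], color='blue', label='Class 1')
plt.xlabel('X1')
plt.ylabel('X2')
plt.legend()
plt.title('Data points by class')
plt.show()
```

If one straight line can be drawn between the red and blue points, the data is linearly separable and the perceptron algorithm applies.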