Gradient Descent: Logistic Regression

Learn about the gradient descent algorithm and its application in logistic regression.

Logistic regression

Consider the scenario where we want to create a model that predicts whether an individual has diabetes or not. This is a case of binary classification. Let's assume that the input matrix

$$X = \begin{bmatrix} x_1^T \\ x_2^T \\ \vdots \\ x_N^T \end{bmatrix} \in \R^{N \times d}$$

represents the collection of the $d$-dimensional input features, such as age, weight, height, cholesterol, etc., for $N$ patients, and that

$$Y = \begin{bmatrix} y_1 \\ y_2 \\ \vdots \\ y_N \end{bmatrix} \in \{0,1\}^{N}$$

denotes their corresponding binary labels (one means diabetes, and zero means no diabetes). The prediction $\hat{Y}$ of a logistic regression model is given as follows:
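In the usual formulation, which we assume here, the model applies the sigmoid function $\sigma(z) = \frac{1}{1 + e^{-z}}$ element-wise to the linear scores $Xw$, where the weight vector $w \in \R^{d}$ is an assumed parameterization (a bias term can be absorbed into $w$ by appending a constant feature to each $x_i$):

$$\hat{Y} = \sigma(Xw) = \frac{1}{1 + e^{-Xw}}$$

A minimal NumPy sketch of this prediction step, where the names `X`, `w`, and the synthetic shapes are illustrative assumptions rather than part of the lesson's data:

```python
import numpy as np

def sigmoid(z):
    # Element-wise sigmoid (logistic) function
    return 1.0 / (1.0 + np.exp(-z))

def predict_proba(X, w):
    # X: (N, d) feature matrix, w: (d,) weight vector
    # Returns the (N,) vector of predicted probabilities of diabetes
    return sigmoid(X @ w)

# Illustrative usage with random data (N = 5 patients, d = 4 features)
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 4))
w = rng.normal(size=4)
Y_hat = predict_proba(X, w)          # probabilities in (0, 1)
Y_pred = (Y_hat >= 0.5).astype(int)  # threshold at 0.5 to get binary labels
```

Each entry of `Y_hat` can be read as the model's estimated probability that the corresponding patient has diabetes, which is what gradient descent will later tune $w$ to make accurate.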
