# Logistic Regression Predictions Using Sigmoid

Learn how the logistic regression coefficients are utilized for the predictions.

## From Logistic Regression coefficients to predictions using sigmoid

Before the next exercise, let's take a look at how the coefficients for logistic regression are used to calculate predicted probabilities and, ultimately, to predict the class of the response variable.

Recall that logistic regression predicts the probability of class membership, according to the sigmoid equation. In the case of two features with an intercept, the equation is as follows:

$p = \frac{1}{1+e^{-(\theta_0 + \theta_1X_1 + \theta_2X_2)}}$
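As a minimal sketch of this equation, the sigmoid can be evaluated directly for a single sample; the parameter and feature values below are made up purely for illustration:

```python
import numpy as np

def sigmoid(z):
    """Logistic (sigmoid) function: maps any real number into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical parameter and feature values (not from any fitted model).
theta_0, theta_1, theta_2 = -1.0, 2.0, 0.5
X1, X2 = 0.3, 0.8

# p = 1 / (1 + e^-(theta_0 + theta_1*X1 + theta_2*X2))
p = sigmoid(theta_0 + theta_1 * X1 + theta_2 * X2)
print(p)  # a probability close to 0.5 for these inputs
```

Because the linear combination here works out to roughly zero, the predicted probability sits near the 0.5 decision boundary.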

When you call the `fit` method of a logistic regression model object in scikit-learn using the training data, the $\theta_0$, $\theta_1$, and $\theta_2$ parameters (intercept and coefficients) are estimated from this labeled training data. Effectively, scikit-learn figures out how to choose values for $\theta_0$, $\theta_1$, and $\theta_2$ so that it classifies as many training data points correctly as possible. We'll gain some insight into how this process works in the next chapter.
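A quick sketch of this fitting step, using a small synthetic two-feature dataset (the values are invented for illustration): after `fit`, the estimated intercept ($\theta_0$) and coefficients ($\theta_1$, $\theta_2$) are exposed on the model object.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Tiny synthetic training set with two features; values are illustrative only.
X_train = np.array([[0.2, 1.0], [0.4, 0.6], [1.5, 2.0],
                    [2.0, 1.8], [0.1, 0.3], [1.8, 2.5]])
y_train = np.array([0, 0, 1, 1, 0, 1])

model = LogisticRegression()
model.fit(X_train, y_train)

print(model.intercept_)  # theta_0: array of shape (1,)
print(model.coef_)       # theta_1, theta_2: array of shape (1, 2)
```

For binary classification, `intercept_` and `coef_` hold one row of fitted parameters, matching the single sigmoid equation above.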

When you call `predict`, scikit-learn calculates predicted probabilities according to the fitted parameter values and the sigmoid equation. A given sample is then classified as positive if $p \geq 0.5$, and negative otherwise.
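This thresholding can be seen by comparing `predict_proba`, which returns the predicted probabilities, with `predict`, which returns the class labels; the training data below is synthetic and only for illustration:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Illustrative synthetic training data with two features.
X_train = np.array([[0.2, 1.0], [0.4, 0.6], [1.5, 2.0],
                    [2.0, 1.8], [0.1, 0.3], [1.8, 2.5]])
y_train = np.array([0, 0, 1, 1, 0, 1])
model = LogisticRegression().fit(X_train, y_train)

X_new = np.array([[0.3, 0.5], [1.7, 2.2]])
probs = model.predict_proba(X_new)[:, 1]  # predicted p for the positive class
labels = model.predict(X_new)             # applies the p >= 0.5 threshold

print(probs)
print(labels)
```

The labels from `predict` agree with thresholding the positive-class column of `predict_proba` at 0.5.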

We know that the plot of the sigmoid equation looks like the following, which we can connect to the equation above by making the substitution $X = \theta_0 + \theta_1X_1 + \theta_2X_2$:
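The substitution can also be checked numerically: computing $X = \theta_0 + \theta_1X_1 + \theta_2X_2$ from the fitted parameters and passing it through the sigmoid should reproduce scikit-learn's own probabilities. The training data here is synthetic, for illustration only:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Illustrative synthetic training data with two features.
X_train = np.array([[0.2, 1.0], [0.4, 0.6], [1.5, 2.0],
                    [2.0, 1.8], [0.1, 0.3], [1.8, 2.5]])
y_train = np.array([0, 0, 1, 1, 0, 1])
model = LogisticRegression().fit(X_train, y_train)

X_new = np.array([[0.3, 0.5], [1.7, 2.2]])

# X = theta_0 + theta_1*X1 + theta_2*X2, one value per sample
z = model.intercept_ + X_new @ model.coef_.ravel()
manual_p = 1.0 / (1.0 + np.exp(-z))

sklearn_p = model.predict_proba(X_new)[:, 1]
print(np.allclose(manual_p, sklearn_p))
```

The two sets of probabilities match, confirming that `predict_proba` is exactly the sigmoid applied to the fitted linear combination.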
