# Sigmoid and Logistic Regression

Learn how the sigmoid function maps logit values into probabilities and the role it plays in logistic regression.

## From logits to probabilities

We are trying to map logit values into probabilities, and we have found, graphically, a function that maps log odds ratios into probabilities.

It follows that our logits are log odds ratios. Admittedly, this conclusion is not very rigorous, but the purpose of this exercise is to illustrate how the results of a regression, represented by the logits ($z$), get mapped into probabilities.

So, this is where we have arrived:

$b + w_1 x_1 + w_2 x_2 = z = \log\left(\dfrac{p}{1 - p}\right)$

$e^{b \space + \space w_1 x_1 \space + \space w_2 x_2} = e^z = \dfrac{p}{1 - p}$
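As a quick numeric illustration with a made-up probability: if $p = 0.75$, the odds ratio is $3$, so the corresponding logit is

$z = \log\left(\dfrac{0.75}{1 - 0.75}\right) = \log(3) \approx 1.0986$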

Let us work this equation out a bit, inverting, rearranging, and simplifying some terms to isolate $p$:

$\dfrac{1}{e^z} = \dfrac{1 \space - \space p}{p}$

$e^{-z} = \dfrac{1}{p} - 1$

$1 + e^{-z} = \dfrac{1}{p}$

$p = \dfrac{1}{1 + e^{-z}}$
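As a sanity check on this algebra, the sketch below (plain Python, with an arbitrarily chosen value of $z$) applies the derived formula for $p$ and then recomputes the log odds ratio, which should recover the original $z$:

```python
import math

z = 1.5  # arbitrary logit value, chosen only for illustration

# p = 1 / (1 + e^{-z}), the expression we just derived
p = 1 / (1 + math.exp(-z))

# Recompute the log odds ratio from p; it should match z
log_odds = math.log(p / (1 - p))

print(p)         # a probability strictly between 0 and 1
print(log_odds)  # equals z, up to floating-point error
```

Any other choice of $z$ works the same way, since the two formulas are exact inverses of each other.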

Does it look familiar? That is the **sigmoid function**! It is the inverse of the log odds ratio and has the following equation:

$p = \sigma{(z)} = \dfrac{1}{1 + e^{-z}}$

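As a sketch, a NumPy version of the sigmoid might look like the following (the function and variable names here are my own, not from any particular library):

```python
import numpy as np

def sigmoid(z):
    """Map logits z to probabilities in the open interval (0, 1)."""
    return 1 / (1 + np.exp(-z))

# Hypothetical bias and weights for a model with two features
b, w1, w2 = 0.5, -1.0, 2.0
x1, x2 = 1.0, 0.25

z = b + w1 * x1 + w2 * x2  # logit produced by the linear part
p = sigmoid(z)             # probability produced by the sigmoid

print(z)  # 0.0
print(p)  # 0.5 (a logit of zero maps to a probability of one half)
```

Because `np.exp` is vectorized, the same `sigmoid` also accepts an array of logits and returns an array of probabilities, which is how it is typically used in logistic regression.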
