
Sigmoid and Logistic Regression

Explore how to transform logits into probabilities with sigmoid functions and implement logistic regression for binary classification in PyTorch. Understand the connection between log odds, sigmoid activation, and model parameters, enabling you to build and evaluate simple neural network models.

From logits to probabilities

We are trying to map logit values into probabilities, and we have found, graphically, a function that maps log odds ratios into probabilities.

Clearly, our logits are log odds ratios. Sure, concluding this is not very scientific, but the purpose of this exercise is to illustrate how the results of a regression, represented by the logits ($z$), get to be mapped into probabilities.

So, that is where we arrived at:

$$b + w_1 x_1 + w_2 x_2 = z = \log\left(\dfrac{p}{1 - p}\right)$$

$$e^{b \, + \, w_1 x_1 \, + \, w_2 x_2} = e^z = \dfrac{p}{1 - p}$$
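The two equations above can be checked numerically: if we treat the regression output $z$ as a log odds ratio, then $e^z$ is the odds ratio, and solving for $p$ gives us a probability. A minimal sketch in PyTorch, using made-up values for the bias, weights, and features (none of these come from the text):

```python
import torch

# Hypothetical parameters and inputs, purely for illustration
b = torch.tensor(0.5)
w = torch.tensor([1.0, -2.0])
x = torch.tensor([0.3, 0.8])

z = b + (w * x).sum()    # logit: b + w1*x1 + w2*x2
odds = torch.exp(z)      # e^z = p / (1 - p), the odds ratio
p = odds / (1 + odds)    # solving the odds equation for p

# Taking the log odds of p recovers the original logit z
print(torch.isclose(torch.log(p / (1 - p)), z))  # tensor(True)
```

Note that `p` always lands between 0 and 1, whatever value `z` takes, which is exactly what we need from a probability.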

Let us work this equation out a bit, inverting, rearranging, and simplifying some terms to isolate $p$:

$$\dfrac{1}{e^z} = \dfrac{1 \, - \, p}{p}$$ ...