# Activation Functions

Learn about the most popular activation functions for deep learning.

Activation functions are non-linear functions that determine the outputs of neurons. As we already discussed, each neuron accepts a set of inputs, multiplies them by the weights, sums them up, and adds a bias.

$z = w_1*x_1 + w_2*x_2 + w_3*x_3 + b_0$

Because this computation is a linear transformation, we then pass the result through a non-linear function $f$ so the network can capture non-linear patterns in the data.

$a = f(w_1*x_1 + w_2*x_2 + w_3*x_3 + b_0)$
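The two steps above can be sketched in plain Python (the function and variable names here are illustrative, not from any particular framework):

```python
# A minimal sketch of a single neuron: weighted sum plus bias, then activation.
def neuron(inputs, weights, bias, f):
    # Weighted sum: z = w1*x1 + w2*x2 + w3*x3 + b0
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    # Non-linear activation: a = f(z)
    return f(z)

# Using the identity function as f exposes the raw linear output z.
z = neuron([1.0, 2.0, 3.0], [0.5, -0.25, 0.1], 0.2, lambda t: t)
print(z)  # 0.5 - 0.5 + 0.3 + 0.2 = 0.5
```

Swapping the identity function for a non-linear `f` (such as the sigmoid below) is what lets stacked neurons model more than straight lines.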

Over the years, many functions have been proposed, each one with its strengths and weaknesses. In this lesson, we will discuss the most common ones.

## Sigmoid

$f(x) = \frac{1}{1+e^{-x}}$
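As a quick sanity check, the formula translates directly into code. This sketch uses only the standard library:

```python
import math

def sigmoid(x):
    # f(x) = 1 / (1 + e^(-x))
    return 1.0 / (1.0 + math.exp(-x))

print(sigmoid(0.0))    # 0.5
print(sigmoid(10.0))   # close to 1
print(sigmoid(-10.0))  # close to 0
```

The output is squashed into the range (0, 1): large positive inputs saturate near 1, large negative inputs near 0, and an input of 0 maps to exactly 0.5.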
