A Rectified Linear Unit (ReLU) is a non-linear activation function used in multi-layer neural networks. It is defined as f(x) = max(0, x), where x is the input value.
In this layer we take every negative value in the filtered image and replace it with zero. The function only produces a non-zero output when a node's input is above zero; for any input at or below zero, the output is zero.
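As a minimal sketch (assuming NumPy is available), the following applies f(x) = max(0, x) element-wise to a small, made-up filtered-image patch, replacing every negative value with zero:

```python
import numpy as np

def relu(x):
    # Element-wise ReLU: keep positive values, replace negatives with zero
    return np.maximum(0, x)

# Hypothetical 3x3 patch from a filtered image (values are illustrative only)
filtered_patch = np.array([[ 0.5, -1.2,  2.0],
                           [-0.3,  0.0,  1.7],
                           [ 3.1, -0.8, -2.4]])

print(relu(filtered_patch))
# [[0.5 0.  2. ]
#  [0.  0.  1.7]
#  [3.1 0.  0. ]]
```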
However, when the input rises above zero, the output has a linear relationship with the input: the output simply equals the input. Because of this simple behaviour, ReLU can accelerate the training of a deep neural network compared with other activation functions, since positive activations are passed through unchanged rather than being squashed toward zero.
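One way to see why this helps training speed is to look at the gradients: ReLU's derivative is 1 for positive inputs and 0 otherwise, whereas a saturating function such as the sigmoid has gradients that shrink toward zero for large inputs. The sketch below (input values are illustrative, not from the text) compares the two:

```python
import numpy as np

def relu_grad(x):
    # Derivative of ReLU: 1 where the input is positive, 0 elsewhere
    return (x > 0).astype(float)

def sigmoid_grad(x):
    # Derivative of the sigmoid, which saturates for large |x|
    s = 1.0 / (1.0 + np.exp(-x))
    return s * (1.0 - s)

x = np.array([-2.0, -0.5, 0.5, 2.0, 5.0])
print(relu_grad(x))     # [0. 0. 1. 1. 1.]
print(sigmoid_grad(x))  # values shrink toward 0 as |x| grows
```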