# Calculate Neural Network Output

Learn how output is calculated in neural networks.

## Calculations in the layers

The first layer of nodes is the input layer, and it doesn't do anything other than represent the input signals. That is, the input nodes don't apply an activation function to the input. There isn't really a reason for this other than convention: that's simply how it has always been done.

So the first layer was easy: there aren't any calculations to be done there.

Next is the second layer, where we do need to do some calculations. For each node in this layer, we need to work out the combined input. Remember the sigmoid function:

$y = \frac{1}{1 + e^{-x}}$
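As a quick sketch, the sigmoid function above can be written in Python like this (the function name `sigmoid` is our own choice, not from the original text):

```python
import math

def sigmoid(x):
    # y = 1 / (1 + e^(-x))
    return 1.0 / (1.0 + math.exp(-x))
```

Note that `sigmoid(0)` gives 0.5, and the output smoothly approaches 1 for large positive inputs and 0 for large negative inputs.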

The $x$ in that function is the combined input to a node: the sum of the outputs from the connected nodes in the previous layer, each moderated (multiplied) by its link weight. The following diagram is like the one we saw previously, but now includes the need to moderate the incoming signals with the link weights.
