Initialize the Weights

Discover why it is important to initialize weights carefully and avoid dead neurons when building a neural network.

In Part I of this course (from “How Machine Learning Works” to “The Perceptron”), weight initialization was a quick job: we set all the weights to 0. By contrast, weight initialization in a neural network comes with a hard-to-spot pitfall. Let’s discuss that pitfall and see how to handle it.

Fearful symmetry

Here is one rule to keep in mind: never initialize all the weights in a neural network to the same value. The reason for that recommendation is subtle, and it comes from the matrix multiplications inside the network. As an example, look at the matrix multiplication below:
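
Here is a minimal NumPy sketch of the problem, assuming a tiny layer with 3 inputs and 4 hidden units (the shapes and the constant 0.5 are illustrative choices, not a prescription):

```python
import numpy as np

rng = np.random.default_rng(0)

x = rng.normal(size=(1, 3))   # one input sample with 3 features
W = np.full((3, 4), 0.5)     # 3x4 weight matrix, every entry the same

h = x @ W
print(h)  # all 4 hidden values are identical: each unit computes the
          # same weighted sum of the same inputs

# Identical outputs mean identical gradients, so after each update the
# columns of W remain clones of one another. Small random values break
# the tie:
W_random = rng.normal(scale=0.1, size=(3, 4))
print(x @ W_random)  # now each hidden unit computes something different
```

With equal weights, every hidden unit computes the same output and receives the same gradient, so the units never differentiate during training. Initializing the weights with small random numbers breaks that symmetry from the very first step.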
