
Dropout

Explore the concept of dropout in convolutional neural networks to reduce co-adaptation among neurons. Learn how applying dropout randomly disables neurons during training, helping prevent overfitting and improving model generalization. Gain practical experience implementing dropout in a dense layer using TensorFlow to enhance CNN performance.

Chapter Goals:

  • Understand why we use dropout in neural networks
  • Apply dropout to a fully-connected layer
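As a preview of the second goal, here is a minimal NumPy sketch of (inverted) dropout applied to the output of a fully-connected layer. The function name, the 0.5 rate, and the array shapes are illustrative choices, not part of the chapter:

```python
import numpy as np

def dropout(activations, rate, training=True, rng=None):
    # Inverted dropout: during training, randomly zero a fraction `rate`
    # of the activations and scale the survivors by 1/(1 - rate) so the
    # expected activation value matches what the network sees at inference.
    if not training or rate == 0.0:
        return activations
    rng = rng or np.random.default_rng()
    keep_prob = 1.0 - rate
    mask = rng.random(activations.shape) < keep_prob
    return activations * mask / keep_prob

rng = np.random.default_rng(42)
dense_out = rng.normal(size=(2, 8))           # hypothetical dense-layer output
dropped = dropout(dense_out, rate=0.5, rng=rng)  # roughly half the units disabled
```

In TensorFlow, the same behavior is provided by `tf.keras.layers.Dropout(rate)`, which applies the mask only when the layer is called in training mode.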

A. Co-adaptation

Co-adaptation occurs when multiple neurons in a layer extract the same, or very similar, hidden features from the input data. This can happen when the connection weights of two different neurons are nearly identical.

Figure: an example of co-adaptation between neurons A and B. Due to identical weights, A and B will pass the same value into C.
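The figure's scenario can be reproduced in a few lines of NumPy. The weight values below are made up for illustration; the point is that two neurons with identical weight vectors produce identical activations, so the downstream neuron C gains nothing from having both:

```python
import numpy as np

x = np.array([0.4, -1.1, 0.9, 0.2])      # a single input vector

w_a = np.array([0.5, -1.2, 0.3, 0.8])    # weights of neuron A
w_b = w_a.copy()                          # co-adapted: B's weights match A's

a_out = np.tanh(w_a @ x)                  # activation of neuron A
b_out = np.tanh(w_b @ x)                  # activation of neuron B

print(a_out == b_out)                     # True: A and B are redundant
```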

When a fully-connected layer has a large number of neurons, co-adaptation is more likely to occur. This can be a problem for two reasons. First, it is a waste of computation when ...