Review of the Deep Network

Explore the behavior of deep neural networks, focusing on the challenges of overfitting and underfitting. Learn how to identify these issues by examining decision boundaries and loss trends, and understand the importance of balancing model complexity to improve accuracy on new data. This lesson sets the foundation for applying regularization techniques to enhance machine learning models.

The general behavior of neural networks

Let’s get back to our deep network and the Echidna dataset. We have learned three important concepts so far:

  • Powerful neural networks tend to overfit.
  • Simple neural networks tend to underfit.
  • We should strike a balance between the two.

A general strategy for striking that balance is to start with an overfitting model function that tracks tiny fluctuations in the data and progressively make it smoother until we hit a good middle ground. That idea of smoothing out the model function is called regularization, and it is the subject of this lesson.
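The core idea can be seen outside neural networks too. As a minimal sketch (not from the lesson, and using ridge regression on synthetic data rather than the Echidna dataset), the code below fits a deliberately flexible polynomial to noisy points, then adds an L2 penalty whose strength `lam` is an illustrative parameter: a larger penalty shrinks the coefficients and smooths the fitted function.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 30)
y = np.sin(3 * x) + rng.normal(0, 0.2, size=x.shape)  # noisy targets

# Design matrix for a degree-12 polynomial: flexible enough to overfit.
X = np.vander(x, 13)

def fit(X, y, lam):
    """Ridge (L2-regularized) least squares; larger lam => smoother fit."""
    n = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n), X.T @ y)

w_overfit = fit(X, y, lam=0.0)  # tracks tiny fluctuations in the data
w_smooth = fit(X, y, lam=1.0)   # penalized, hence a smoother model function

# The regularized solution has much smaller coefficients.
print(np.linalg.norm(w_overfit) > np.linalg.norm(w_smooth))
```

The same principle carries over to neural networks: instead of polynomial coefficients, the penalty acts on the network's weights, discouraging the sharp wiggles that come with overfitting.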

In the previous chapter, we took the first step of that process: we built a deep neural network that overfits the data at hand. Let's take a closer look at that network's model function, and afterward we'll see how to make it smoother.

To gain more insight into overfitting, a few changes are made to our deep neural network ...