Issues in Deep Learning
Understand common issues in deep learning such as overfitting and data scarcity, and learn practical solutions including L2 regularization, dropout, data augmentation, normalization, mini-batch gradient descent, learning rate scheduling, and batch normalization to build more robust and efficient neural networks.
Regularization
Training the model reduces the bias error. However, as we keep training a dense network, the training error continues to fall while the test error starts to increase; the model has overfit the training data. We can handle this situation using regularization.
L2 Regularization
This technique penalizes large weight values to prevent overfitting. In logistic regression, we minimize the loss function at each iteration; L2 regularization adds a term proportional to the squared magnitude of the weights, so the optimizer prefers smaller weights. The minimization equation is given below.
Minimization equation:
...