Combined Refinements

Let's combine all the refinements we learned in the previous lessons.


These refinements are the BCE loss, the leaky ReLU activation, the Adam optimiser, and layer normalization.

Be wise in combining refinements!

Because the BCE loss can't accept values outside the range 0 to 1, and such values can emerge from a leaky ReLU, we apply a sigmoid after the final layer but keep a LeakyReLU after the hidden layer.
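Below is a minimal PyTorch sketch of a discriminator that combines all four refinements: a LeakyReLU after the hidden layer, layer normalization, a final sigmoid so the output stays in (0, 1) for the BCE loss, and the Adam optimiser. The layer sizes (784 inputs, 200 hidden units), the leaky ReLU slope, and the learning rate are illustrative assumptions, not values prescribed by this lesson.

```python
import torch
import torch.nn as nn

class Discriminator(nn.Module):
    def __init__(self):
        super().__init__()
        self.model = nn.Sequential(
            nn.Linear(784, 200),
            nn.LeakyReLU(0.02),   # leaky ReLU after the hidden layer
            nn.LayerNorm(200),    # layer normalization
            nn.Linear(200, 1),
            nn.Sigmoid()          # sigmoid keeps the output in (0, 1) for BCE
        )
        self.loss_function = nn.BCELoss()                                  # BCE loss
        self.optimiser = torch.optim.Adam(self.parameters(), lr=0.0001)    # Adam optimiser

    def forward(self, inputs):
        return self.model(inputs)

    def train_step(self, inputs, targets):
        # one training step: forward pass, loss, backprop, parameter update
        outputs = self.forward(inputs)
        loss = self.loss_function(outputs, targets)
        self.optimiser.zero_grad()
        loss.backward()
        self.optimiser.step()
        return loss.item()
```

Note how the sigmoid appears only on the output layer, while the hidden layer keeps its LeakyReLU, so the BCE loss always receives values between 0 and 1.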
