Multiple Layers
Explore the process of stacking multiple LSTM cell layers in recurrent neural networks to enhance the ability to capture complex sequence features. Learn how to construct multi-layer LSTMs in TensorFlow, apply dropout for regularization, and understand their impact on language modeling tasks. This lesson guides you through implementing a stacked LSTM model for improved NLP outcomes.
Chapter Goals:
Learn how to stack multiple cell layers in an RNN
A. Stacking layers
Similar to how we can stack hidden layers in an MLP, we can also stack cell layers in an RNN. Adding cell layers allows the model to pick up on more complex features from the input sequence and therefore improve performance when trained on a large enough dataset.
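As a concrete sketch of this idea, here is a minimal stacked LSTM built with the Keras API in TensorFlow 2.x. The vocabulary size, layer widths, and dropout rate are illustrative choices, not values from this lesson; note that every LSTM layer except the last uses `return_sequences=True` so that it emits an output at each time step for the next layer to consume.

```python
import tensorflow as tf

# A 2-layer (stacked) LSTM language-model sketch.
# Hyperparameters below are placeholders for illustration.
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(input_dim=1000, output_dim=32),      # token IDs -> vectors
    tf.keras.layers.LSTM(64, return_sequences=True, dropout=0.3),  # layer 1: full output sequence
    tf.keras.layers.LSTM(64, dropout=0.3),                         # layer 2: final hidden state only
    tf.keras.layers.Dense(1),                                      # task-specific output head
])

# Run a dummy batch of 2 sequences, each 5 tokens long.
out = model(tf.zeros((2, 5), dtype=tf.int32))
print(out.shape)  # (2, 1)
```

The `dropout` argument applies dropout to each layer's inputs during training, which is a common way to regularize stacked recurrent models.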
In the diagram above, the RNN contains 2 cell layers, i.e. two cells stacked on top of each other. At each time step, the first cell's output becomes the input for the second cell, and the second cell's output serves as the RNN's overall output for that time step.
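The per-time-step data flow can be traced with a small NumPy loop. To keep the focus on the stacking itself, this sketch uses plain tanh recurrent cells rather than full LSTM cells (the gating is omitted), and all weights and sizes are made-up illustrative values.

```python
import numpy as np

rng = np.random.default_rng(0)
T, d_in, d_hid = 4, 3, 5  # time steps, input size, hidden size

# Random weights for two stacked tanh-RNN cells (LSTM gates omitted for brevity).
W1, U1 = rng.normal(size=(d_in, d_hid)), rng.normal(size=(d_hid, d_hid))
W2, U2 = rng.normal(size=(d_hid, d_hid)), rng.normal(size=(d_hid, d_hid))

x = rng.normal(size=(T, d_in))  # one input sequence
h1 = np.zeros(d_hid)            # layer-1 hidden state
h2 = np.zeros(d_hid)            # layer-2 hidden state
outputs = []
for t in range(T):
    h1 = np.tanh(x[t] @ W1 + h1 @ U1)  # layer 1 consumes the raw input
    h2 = np.tanh(h1 @ W2 + h2 @ U2)    # layer 2 consumes layer 1's output
    outputs.append(h2)                 # layer 2's output is the RNN's output

print(np.array(outputs).shape)  # (4, 5): one hidden vector per time step
```

Each iteration shows the key property of stacking: the second cell never sees the raw input directly, only the first cell's output at the same time step.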