
Padding

Understand how to apply padding to tokenized text sequences of varying lengths to create fixed-size inputs for LSTM language models. Learn the importance of padding tokens and how to prepare input and target sequences for training batches in NLP tasks.

Chapter Goals:

  • Learn about sequence lengths and padding

A. Varied length sequence

Most neural networks require input data of a fixed length. This is because they typically have a feed-forward structure, meaning they compute the network's output using multiple layers of fixed sizes.
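Since fixed-size layers expect fixed-size inputs, variable-length token sequences are usually padded (and, if needed, truncated) to a common length before being batched. The sketch below illustrates the idea; the pad token ID (`0`) and the `pad_sequences` helper are assumptions for illustration, not names from this course.

```python
# Minimal sketch: right-pad variable-length token sequences to a fixed size.
# The pad token ID (0) and max_length are illustrative assumptions.

def pad_sequences(sequences, max_length, pad_token=0):
    """Truncate or right-pad each token sequence to exactly max_length."""
    padded = []
    for seq in sequences:
        seq = seq[:max_length]                              # truncate long sequences
        seq = seq + [pad_token] * (max_length - len(seq))   # pad short ones
        padded.append(seq)
    return padded

batch = [[4, 12, 7], [9, 3], [5, 8, 2, 6, 1]]
print(pad_sequences(batch, max_length=4))
# → [[4, 12, 7, 0], [9, 3, 0, 0], [5, 8, 2, 6]]
```

Because every padded sequence has the same length, the batch can be stacked into a single tensor for an LSTM; the pad token is reserved so the model can learn to ignore it.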

However, since text ...