Visualizing Attention Patterns

Explore how to visualize attention patterns in sequence-to-sequence neural machine translation models. Learn to extract and interpret attention matrices, understand the alignment between input and predicted words, and evaluate model behavior through heat maps of attention weights.

Recall that we defined a model called attention_visualizer specifically to generate attention matrices. With the model trained, we can now inspect these attention patterns by feeding it data. Here’s how the model was defined: ...
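As a rough illustration of what such an attention matrix looks like, here is a minimal NumPy sketch (not the tutorial's actual attention_visualizer model; the state shapes and names are hypothetical) that computes dot-product attention weights between made-up encoder and decoder hidden states. Each row of the resulting matrix is a probability distribution over source tokens, which is exactly what a heat map of attention weights displays.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Hypothetical hidden states: 6 source tokens, 4 target tokens, 32-dim vectors
rng = np.random.default_rng(0)
encoder_states = rng.normal(size=(6, 32))
decoder_states = rng.normal(size=(4, 32))

# Dot-product scores, then softmax over the source axis
scores = decoder_states @ encoder_states.T       # shape (4, 6)
attention_matrix = softmax(scores, axis=-1)

# Each row sums to 1: the weight each predicted word assigns to each input word
print(attention_matrix.shape)          # (4, 6)
print(attention_matrix.sum(axis=-1))   # each entry ~1.0
```

A matrix like this can be rendered as a heat map (e.g. with `matplotlib.pyplot.imshow`), with target words on one axis and source words on the other, so that bright cells show which input words the model attended to when producing each output word.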