Visualizing Attention Patterns
Learn to visualize attention patterns to gain deeper insight into how the model attends to its inputs.
Recall that we defined a separate model, attention_visualizer, specifically to produce attention matrices. Now that the model is trained, we can inspect these attention patterns by feeding data through it. Here's how the model was defined:
attention_visualizer = tf.keras.models.Model(inputs=[encoder.inputs, decoder_input], outputs=[attn_weights, decoder_out])
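Before plotting anything, it helps to recall what the attn_weights output represents: for each decoder step, the attention weights form a probability distribution over the encoder's time steps, so each row of the attention matrix sums to 1. The snippet below is an illustrative sketch with made-up shapes (not the course's model output) that demonstrates this property:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Illustrative attention scores: 5 decoder steps attending over 8 encoder steps
rng = np.random.default_rng(0)
scores = rng.normal(size=(5, 8))
attn_matrix = softmax(scores, axis=-1)

print(attn_matrix.shape)         # (5, 8): one row per decoder step
print(attn_matrix.sum(axis=-1))  # each row sums to ~1.0
```

A heatmap of such a matrix (e.g., with matplotlib's imshow) then shows which encoder positions each decoder step focused on.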
The get_attention_matrix_for_sampled_data() function
We'll also define a function that returns the processed attention matrix, along with label data, in a form we can use directly for visualization:
...
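The function body is omitted above. As an illustration only, the sketch below shows the kind of processing such a helper typically performs; it is a hypothetical version (the names, signature, and padding convention are assumptions, not the lesson's actual code) that trims padded time steps from a single sample's attention matrix and returns the source and target tokens to use as axis labels:

```python
import numpy as np

def get_attention_matrix_for_sampled_data(attn_weights, src_tokens, tgt_tokens, pad_token="<pad>"):
    """Hypothetical sketch: trim padding and return (matrix, x_labels, y_labels).

    attn_weights: (decoder_len, encoder_len) attention matrix for one sample.
    src_tokens / tgt_tokens: token lists, possibly padded with pad_token.
    """
    # Count real (non-padding) tokens on each side
    src_len = sum(1 for t in src_tokens if t != pad_token)
    tgt_len = sum(1 for t in tgt_tokens if t != pad_token)
    # Keep only the rows/columns that correspond to real tokens
    matrix = attn_weights[:tgt_len, :src_len]
    return matrix, src_tokens[:src_len], tgt_tokens[:tgt_len]

# Example with dummy data: uniform attention over 5 encoder steps
attn = np.full((4, 5), 1 / 5)
src = ["I", "like", "cats", "<pad>", "<pad>"]
tgt = ["ich", "mag", "Katzen", "<pad>"]
matrix, x_labels, y_labels = get_attention_matrix_for_sampled_data(attn, src, tgt)
print(matrix.shape)  # (3, 3)
```

The trimmed matrix and label lists can then be passed straight to a heatmap plotting call, with x_labels on the encoder axis and y_labels on the decoder axis.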