Quiz Yourself on Transformers

Take a multiple-choice quiz on attention and transformers.

This was a long chapter with many lessons, and for good reason: attention-based models and Transformers are currently the dominant architectures in NLP, where they have outperformed LSTMs, and they are steadily gaining ground in computer vision as well. That is why it is vital to understand them fully. This is not easy, however, because their subcomponents borrow ideas and techniques from many other architectures.

Hopefully, you now have a strong understanding of Transformers. Take the following quiz to test yourself and clear up any misconceptions.
