Transformers and Transfer Learning
Explore the transformer architecture, introduced in 2017, and how it revolutionized NLP through self-attention mechanisms. Understand the differences between transformers and LSTMs, and discover how transfer learning with pre-trained models like BERT enhances NLP applications in spaCy.
A milestone in NLP happened in 2017 with the release of the research paper "Attention Is All You Need," which introduced the transformer architecture.