Summary: Sequence-to-Sequence Learning—NMT
Review what we've learned in this chapter.
In this chapter, we talked in detail about NMT systems. MT is the task of translating text from a source language to a target language. First, we briefly reviewed the history of MT to appreciate what it took for the field to become what it is today, and we saw that today's highest-performing MT systems are in fact NMT systems.

Next, we tackled the NMT task of translating English to German. We covered the preprocessing the dataset requires and extracted important statistics about the data (e.g., sequence lengths).

We then examined the fundamental concepts behind these systems and decomposed the model into an embedding layer, an encoder, a context vector, and a decoder. We also introduced techniques aimed at improving model performance, such as teacher forcing and Bahdanau attention.

Finally, we discussed how training and inference work in NMT systems and introduced BLEU, a metric used to measure performance on sequence-to-sequence problems like MT. The sketches below illustrate some of these ideas in code.
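For concreteness, here is a minimal sketch of that encoder-decoder decomposition, assuming a GRU-based model built with TensorFlow/Keras; the vocabulary sizes and layer dimensions are illustrative placeholders, not the chapter's actual values.

```python
import tensorflow as tf

# Illustrative sizes only; the chapter's actual values may differ.
SRC_VOCAB, TGT_VOCAB, EMB_DIM, UNITS = 8000, 8000, 128, 256

# Encoder: embed the source tokens and summarize them with a GRU.
enc_inputs = tf.keras.Input(shape=(None,), dtype="int32", name="source_tokens")
enc_emb = tf.keras.layers.Embedding(SRC_VOCAB, EMB_DIM)(enc_inputs)
_, enc_state = tf.keras.layers.GRU(UNITS, return_state=True)(enc_emb)

# Decoder: the encoder's final state acts as the context vector
# that initializes the decoder GRU.
dec_inputs = tf.keras.Input(shape=(None,), dtype="int32", name="target_tokens")
dec_emb = tf.keras.layers.Embedding(TGT_VOCAB, EMB_DIM)(dec_inputs)
dec_seq = tf.keras.layers.GRU(UNITS, return_sequences=True)(
    dec_emb, initial_state=enc_state)

# Per-step logits over the target vocabulary.
logits = tf.keras.layers.Dense(TGT_VOCAB)(dec_seq)

model = tf.keras.Model([enc_inputs, dec_inputs], logits)
```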
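Teacher forcing itself is mostly a matter of how the training data is arranged: at each step, the decoder is fed the ground-truth previous token rather than its own prediction. A tiny sketch with made-up token IDs:

```python
import numpy as np

# Suppose `target` holds a tokenized target sentence,
# <s> w1 w2 w3 </s>, as integer IDs (values here are made up).
target = np.array([[1, 12, 7, 42, 2]])  # toy batch of one sentence

# With teacher forcing, the decoder input at step t is the
# ground-truth token from step t-1, and the label is the token at t:
decoder_inputs = target[:, :-1]   # <s> w1 w2 w3
decoder_labels = target[:, 1:]    # w1  w2 w3 </s>
```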
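Bahdanau (additive) attention replaces the single fixed context vector with one recomputed at every decoding step. A NumPy sketch of the scoring, with random arrays standing in for the learned parameters and hidden states:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

rng = np.random.default_rng(0)
src_len, units = 5, 8                            # toy sizes
enc_outputs = rng.normal(size=(src_len, units))  # one encoder state per source token
dec_state = rng.normal(size=(units,))            # current decoder state

# Parameters W1, W2, v are learned jointly with the model (random here).
W1 = rng.normal(size=(units, units))
W2 = rng.normal(size=(units, units))
v = rng.normal(size=(units,))

# Additive score for each source position i: e_i = v^T tanh(W1 h_i + W2 s)
scores = np.tanh(enc_outputs @ W1 + dec_state @ W2) @ v
weights = softmax(scores)        # attention distribution over source tokens
context = weights @ enc_outputs  # context vector for this decoding step
```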
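Finally, BLEU scores a candidate translation by comparing its n-grams against one or more reference translations. One way to compute a sentence-level score, assuming NLTK is available (the chapter's own computation may differ):

```python
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

reference = [["the", "cat", "sat", "on", "the", "mat"]]  # list of references
candidate = ["the", "cat", "is", "on", "the", "mat"]     # model output

# Smoothing avoids a zero score when a higher-order n-gram never matches.
score = sentence_bleu(reference, candidate,
                      smoothing_function=SmoothingFunction().method1)
print(f"BLEU: {score:.3f}")
```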