Assignments and Supplemental Reading Materials
Now that you have built a project and completed the quiz, you are ready to move to the next step: exploring some supplemental reading materials and completing the provided assignments to gain a better understanding of the topics we discussed.
Supplemental reading materials
An Introduction to Hidden Markov Models by L. R. Rabiner and B. H. Juang, from Stanford University. This paper introduces the theory of hidden Markov models and illustrates how they have been applied to problems in speech recognition.
Comparing the Performance of the LSTM and HMM Language Models via Structural Similarity by Larkin Liu et al. in 2019. We suggest you take a look at this paper. It will help you understand the subtle differences between the two models, and you will learn which model suits which type of application.
Hidden Markov Chains, Entropic Forward-Backward, and Part-Of-Speech Tagging by E. Azeraf et al. in 2020. This paper discusses a new way to work with Markov chains in the context of a Part-Of-Speech tagger. Understanding these techniques will give you a new way of thinking about some of the essential problems in NLP, and may even serve as a starting point for developing techniques of your own in your research.