Summary: Advanced Word Vector Algorithms

Let's review what we learned in this chapter.

GloVe embeddings

We discussed GloVe, another word embedding learning technique. GloVe takes Word2vec-style algorithms a step further by incorporating global co-occurrence statistics into the optimization, which improves performance.
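The global statistics enter through GloVe's weighted least-squares objective, which fits word and context vectors to the logarithm of co-occurrence counts. Below is a minimal NumPy sketch of that loss; the co-occurrence matrix, dimensionality, and hyperparameter values are toy assumptions for illustration, not values from the chapter.

```python
import numpy as np

# Hypothetical co-occurrence counts X for a 4-word vocabulary.
X = np.array([
    [0.0, 3.0, 1.0, 0.0],
    [3.0, 0.0, 2.0, 1.0],
    [1.0, 2.0, 0.0, 4.0],
    [0.0, 1.0, 4.0, 0.0],
])

V, d = X.shape[0], 5
rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(V, d))        # word vectors
W_tilde = rng.normal(scale=0.1, size=(V, d))  # context vectors
b = np.zeros(V)                                # word biases
b_tilde = np.zeros(V)                          # context biases

def weight(x, x_max=100.0, alpha=0.75):
    # Weighting function f(X_ij) that caps the influence of very
    # frequent pairs (x_max and alpha follow the GloVe paper's defaults).
    return np.where(x < x_max, (x / x_max) ** alpha, 1.0)

def glove_loss(W, W_tilde, b, b_tilde, X):
    # Sum of f(X_ij) * (w_i . w~_j + b_i + b~_j - log X_ij)^2
    # over observed (nonzero) co-occurrences only.
    loss = 0.0
    for i in range(V):
        for j in range(V):
            if X[i, j] > 0:
                diff = W[i] @ W_tilde[j] + b[i] + b_tilde[j] - np.log(X[i, j])
                loss += weight(X[i, j]) * diff ** 2
    return loss

print(glove_loss(W, W_tilde, b, b_tilde, X))
```

In practice the vectors and biases are trained by stochastic gradient descent on this loss, and the final embedding for a word is typically the sum of its word and context vectors.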

ELMo embeddings

Next, we learned about a much more advanced algorithm known as ELMo. Instead of assigning each word a single fixed vector, ELMo produces contextualized representations by looking at a word within its sentence or phrase, not in isolation.
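The key difference from GloVe or Word2vec is that the same surface word can receive different vectors in different sentences. The toy sketch below illustrates this: the "contextual" function here is a hypothetical stand-in that simply mixes a word's static vector with its neighbors' vectors, whereas real ELMo uses a deep bidirectional LSTM trained as a language model. The embedding values are made up for illustration.

```python
import numpy as np

# Hypothetical static embeddings: one fixed vector per word.
static = {
    "the":   np.array([0.1, 0.1]),
    "river": np.array([0.0, 1.0]),
    "money": np.array([0.5, 0.5]),
    "bank":  np.array([1.0, 0.0]),
}

def contextual(sentence, idx):
    # Toy stand-in for a contextual encoder: blends the word's own
    # vector with the average of the other words in the sentence.
    # (ELMo instead runs a deep biLSTM over the whole sentence.)
    word_vec = static[sentence[idx]]
    neighbours = [static[w] for k, w in enumerate(sentence) if k != idx]
    return 0.5 * word_vec + 0.5 * np.mean(neighbours, axis=0)

s1 = ["the", "river", "bank"]
s2 = ["the", "money", "bank"]

# A static embedding gives "bank" the same vector in both sentences,
# but the contextual representation changes with the surrounding words.
v1 = contextual(s1, 2)
v2 = contextual(s2, 2)
print(np.allclose(v1, v2))  # False: the two "bank" vectors differ
```

This is why ELMo can separate the riverside sense of "bank" from the financial sense, something a single static vector cannot do.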

Real-world application

Finally, we discussed a real-world application of word embeddings: document classification. We showed that word embeddings are powerful enough to let a simple multiclass logistic regression model classify related documents reasonably well. ELMo performed the best out of skip-gram, CBOW, and GloVe, owing to the vast amount of data it was trained on.
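A common recipe for this task is to mean-pool a document's word vectors into a single feature vector and fit a softmax (multiclass logistic) regression on top. The sketch below uses simulated document embeddings rather than real GloVe or ELMo vectors: each class gets a hypothetical "topic direction" and documents are noisy samples around it, so the numbers are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(1)
n_classes, d, docs_per_class = 3, 8, 30

# Simulated mean-pooled document embeddings: each class has a topic
# direction, and each document is that direction plus Gaussian noise.
topics = rng.normal(size=(n_classes, d))
X = np.vstack([topics[c] + 0.3 * rng.normal(size=(docs_per_class, d))
               for c in range(n_classes)])
y = np.repeat(np.arange(n_classes), docs_per_class)

# Multiclass (softmax) logistic regression trained by gradient descent.
W = np.zeros((d, n_classes))
Y = np.eye(n_classes)[y]  # one-hot labels
for _ in range(300):
    logits = X @ W
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    P = np.exp(logits)
    P /= P.sum(axis=1, keepdims=True)            # class probabilities
    W -= 0.1 * X.T @ (P - Y) / len(X)            # cross-entropy gradient step

acc = (np.argmax(X @ W, axis=1) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

When the document embeddings separate the classes well, as good word embeddings tend to, even this linear model reaches high accuracy, which is the point the chapter's experiment makes.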