Skip-gram is an unsupervised word embedding technique that learns vector representations of words by training a model to predict the context words surrounding a given target word. It is one of the two architectures commonly used to train word2vec models; the other is CBOW, described next.
Continuous bag-of-words (CBOW) is the other word2vec architecture. It is the reverse of skip-gram: it predicts the current word from a window of surrounding context words. Unlike skip-gram, however, the order of the context words does not matter in CBOW.
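In practice, both architectures are available through the same tooling. Here is a minimal sketch using Gensim; the corpus, vector size, and other hyperparameters are illustrative assumptions:

```python
from gensim.models import Word2Vec

# Toy corpus: a list of tokenized sentences (hypothetical data).
sentences = [
    ["the", "quick", "brown", "fox", "jumps", "over", "the", "lazy", "dog"],
    ["the", "dog", "barks", "at", "the", "quick", "fox"],
]

# sg=1 selects the skip-gram architecture; sg=0 (the default) selects CBOW.
skipgram_model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1)
cbow_model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=0)
```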
Consider an example sentence such as "the quick brown fox jumps over the lazy dog." Given a sentence, we create pairs of the target word and its context words according to the window size. A window size of 2 means that we take up to two context words before and two after each target word.
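The pair-generation step is simple enough to sketch in plain Python; the sentence and window size below are illustrative assumptions:

```python
def make_pairs(tokens, window=2):
    """Pair each target word with up to `window` words before and after it."""
    pairs = []
    for i, target in enumerate(tokens):
        start = max(0, i - window)               # clip the window at the sentence start
        end = min(len(tokens), i + window + 1)   # and at the sentence end
        for j in range(start, end):
            if j != i:                           # skip the target word itself
                pairs.append((target, tokens[j]))
    return pairs

tokens = ["the", "quick", "brown", "fox", "jumps"]
print(make_pairs(tokens, window=2))
# For the target "brown", this yields ("brown", "the"), ("brown", "quick"),
# ("brown", "fox"), and ("brown", "jumps").
```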
We then feed these target-context pairs into a skip-gram word2vec model for training. After training, the model can predict the likely surrounding context words for a given target word.
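Continuing the Gensim sketch above, a trained skip-gram model can then be queried. `most_similar` ranks words by vector similarity, a common practical proxy for "words that appear in similar contexts," while `predict_output_word` scores vocabulary words directly with the model's output layer (it requires negative sampling, which is Gensim's default). The query words here are assumptions:

```python
# Words whose vectors lie closest to "fox"; with skip-gram training,
# these tend to be words that occur in similar contexts.
print(skipgram_model.wv.most_similar("fox", topn=3))

# Directly score candidate output words for a given context window.
print(skipgram_model.predict_output_word(["quick", "brown"], topn=3))
```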