Multilingual Sentence-BERT Model

Explore how to apply pretrained multilingual Sentence-BERT models using the sentence-transformers library. Learn to compute similarity between sentences in different languages and understand how to fine-tune these models for various NLP applications.

We learned how to make a monolingual model multilingual through knowledge distillation. Now let's learn how to use a pre-trained multilingual model. The researchers have made their pre-trained models publicly available through the sentence-transformers library, so we can directly download a pre-trained model and use it for our task, as the sketch below shows.
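
For illustration, here is a minimal sketch of that workflow. It assumes the sentence-transformers package is installed and uses the paraphrase-multilingual-MiniLM-L12-v2 checkpoint as an example; any other pre-trained multilingual checkpoint from the library could be substituted.

```python
from sentence_transformers import SentenceTransformer, util

# Download a pre-trained multilingual model from the sentence-transformers hub.
# The checkpoint name here is just an example of a multilingual model.
model = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")

# Two sentences with the same meaning, written in different languages.
sentence_en = "How are you doing today?"
sentence_de = "Wie geht es dir heute?"

# Encode both sentences into fixed-size sentence embeddings.
embeddings = model.encode([sentence_en, sentence_de])

# Compute the cosine similarity between the two embeddings.
# Semantically equivalent sentences should score high even across languages.
similarity = util.cos_sim(embeddings[0], embeddings[1])
print(f"Cross-lingual similarity: {similarity.item():.4f}")
```

Because the model maps sentences from different languages into a shared vector space, the same encode-and-compare pattern works for cross-lingual search, paraphrase mining, and clustering without any language-specific preprocessing.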

Pre-trained multilingual models

...