Semantic Search with Transformers
In this project, we’ll use the sentence-transformers library to perform semantic search over a corpus of machine learning research papers. sentence-transformers provides Transformer models that have been fine-tuned to produce semantically meaningful embeddings of natural language text; Transformer-based models are well suited to this because they form high-level linguistic and semantic representations of language. As we’ll see in the Experiments section, semantic search can retrieve articles through synonyms and related contexts, even when the query words never appear verbatim in the text. Many modern search engines now rely on Transformer-based models, which represent the current state of the art and are steadily displacing purely lexical search.
We’ll encode the dataset using sentence-transformers and build an index for k-nearest-neighbors search with Facebook’s Faiss library. We’ll then run a few experiments, using both summaries from the dataset and free-text queries to search the index for similar articles.