Transformers and TensorFlow

Let's do a deep dive into transformers code with TensorFlow.

Pre-trained transformer models are released to the developer community as open source by many organizations, including Google, Facebook, and HuggingFace. All of these organizations provide pre-trained models together with convenient interfaces for integrating transformers into our Python code; the interfaces are compatible with PyTorch, TensorFlow, or both.

We'll use HuggingFace's pre-trained transformers through their TensorFlow interface to the transformer models. HuggingFace is an AI company focused on NLP and strongly committed to open source. Let's take a closer look at what is available in HuggingFace Transformers.
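
As a quick, minimal sketch of what that interface looks like, the snippet below loads a pre-trained checkpoint through the TensorFlow classes of the transformers library and runs a single forward pass. It assumes the transformers and tensorflow packages are installed; the checkpoint name and example sentence are only illustrative.

```python
# Minimal sketch: load a pre-trained transformer via the TensorFlow
# interface of HuggingFace Transformers and run one forward pass.
import tensorflow as tf
from transformers import AutoTokenizer, TFAutoModel

# Download (or load from cache) the tokenizer and the TensorFlow weights.
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = TFAutoModel.from_pretrained("distilbert-base-uncased")

# Tokenize an example sentence and feed it to the model.
inputs = tokenizer("Transformers are easy to use.", return_tensors="tf")
outputs = model(inputs)

# The output contains the hidden states of the last layer.
print(outputs.last_hidden_state.shape)  # (batch_size, sequence_length, hidden_size)
```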

HuggingFace Transformers

Let's explore HuggingFace's pre-trained models, the TensorFlow interface for using them, and HuggingFace model conventions in general. HuggingFace offers many kinds of models, each dedicated to a task such as text classification, question answering, or sequence-to-sequence modeling, as sketched below.
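
To make the task-to-model mapping concrete, here is an illustrative sketch using the library's TensorFlow "auto" classes. The checkpoint name is an example of a publicly available model, not a recommendation or an exhaustive list.

```python
# Illustrative sketch: each task has a matching TensorFlow "auto" class
# that loads a checkpoint together with the task-specific head.
from transformers import (
    TFAutoModelForSequenceClassification,  # text classification
    TFAutoModelForQuestionAnswering,       # extractive question answering
    TFAutoModelForSeq2SeqLM,               # sequence-to-sequence modeling
)

# Example: load a sentiment classifier fine-tuned on SST-2.
classifier = TFAutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased-finetuned-sst-2-english"
)
```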

The following diagram is taken from the HuggingFace documentation and shows details of the distilbert-base-uncased-distilled-squad model. In the documentation, the task is tagged first (the Question Answering tag in the upper-left corner of the diagram), followed by the supported deep learning libraries and formats (PyTorch, TensorFlow, TFLite, and TFSavedModel for this model), the dataset the model was trained on (SQuAD in this case), the model language (en for English), the license, and the base model's name (DistilBERT in this case).
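
As an illustration of that model card in use, the following sketch runs the distilbert-base-uncased-distilled-squad checkpoint for extractive question answering through the pipeline API with the TensorFlow backend. The question and context are made up for the example.

```python
# Sketch: question answering with the distilbert-base-uncased-distilled-squad
# checkpoint, using the TensorFlow weights.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="distilbert-base-uncased-distilled-squad",
    framework="tf",  # select the TensorFlow backend
)

result = qa(
    question="Which library provides the pre-trained models?",
    context="HuggingFace Transformers provides pre-trained models "
            "with both PyTorch and TensorFlow interfaces.",
)
print(result)  # e.g. {'score': ..., 'start': ..., 'end': ..., 'answer': ...}
```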

Some models are trained with similar algorithms and so belong to the same model family. For example, the DistilBERT family includes many models, such as distilbert-base-uncased and distilbert-base-multilingual-cased. Each model name also encodes information such as casing (whether the model distinguishes uppercase from lowercase; an uncased model lowercases all of its input) and the model language, such as en, de, or multilingual.
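
A small sketch of what the casing convention means in practice: an uncased tokenizer lowercases its input before tokenizing, while a cased tokenizer preserves the original casing. The checkpoint names and the printed outputs below are indicative examples.

```python
# Sketch: the effect of "uncased" vs. "cased" in a model name.
from transformers import AutoTokenizer

uncased = AutoTokenizer.from_pretrained("distilbert-base-uncased")
cased = AutoTokenizer.from_pretrained("distilbert-base-cased")

print(uncased.tokenize("Berlin"))  # lowercased input, e.g. ['berlin']
print(cased.tokenize("Berlin"))    # casing preserved, e.g. ['Berlin']
```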
