Text Summarization with Transformers

PROJECT


Learn how to generate summaries of news reports using the BART transformer model.

You will learn to:

Fine-tune BART for text summarization using PyTorch.

Load and save models using the Hugging Face Hub API.

Monitor training progress using the Weights & Biases library.

Evaluate the model using ROUGE.

Skills

Deep Learning

Natural Language Processing

Machine Learning

Prerequisites

Intermediate programming skills in Python

Intermediate knowledge of deep learning

Basic understanding of Transformer-based models

Technologies

Python

PyTorch

Hugging Face

Project Description

Transformer-based models have recently become popular due to their ability to understand and generate natural language. They work by building high-level linguistic and semantic representations of text, which are learned through unsupervised pre-training on large text corpora using objectives such as masked language modeling (MLM).
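To get a feel for what MLM pre-training teaches a model, the short sketch below queries a masked language model through the Hugging Face fill-mask pipeline. It is an illustration only, not part of the project code; the checkpoint name and example sentence are assumptions.

```python
# Minimal illustration of masked language modeling (MLM).
# Assumes the `transformers` library is installed; "bert-base-uncased" and the
# example sentence are illustrative choices, not part of the project code.
from transformers import pipeline

# A fill-mask pipeline predicts the token hidden behind the [MASK] placeholder.
unmasker = pipeline("fill-mask", model="bert-base-uncased")

for prediction in unmasker("The reporter filed the [MASK] from the scene."):
    print(f'{prediction["token_str"]:>12}  score={prediction["score"]:.3f}')
```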

In this project, we’ll use the BART model by Facebook AI to summarize news articles. BART pairs a bidirectional encoder with an autoregressive decoder and is pre-trained as a denoising autoencoder: it learns to reconstruct text that has been corrupted with objectives such as text infilling and sentence permutation. We’ll load the model, fine-tune it on a summarization dataset, and finally evaluate it using the ROUGE score.
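As a rough sketch of what loading BART and generating a summary looks like with the Hugging Face `transformers` library (the checkpoint name and generation settings here are assumptions; in the project you’ll fine-tune your own checkpoint on a news dataset):

```python
# Sketch: load a BART summarization checkpoint and generate a summary.
# "facebook/bart-large-cnn" and the generation settings are illustrative
# assumptions, not the project's exact configuration.
import torch
from transformers import BartForConditionalGeneration, BartTokenizer

model_name = "facebook/bart-large-cnn"
tokenizer = BartTokenizer.from_pretrained(model_name)
model = BartForConditionalGeneration.from_pretrained(model_name)
model.eval()

article = "A long news article to be summarized goes here..."
inputs = tokenizer(article, truncation=True, max_length=1024, return_tensors="pt")

with torch.no_grad():
    summary_ids = model.generate(
        inputs["input_ids"],
        num_beams=4,        # beam search for higher-quality summaries
        max_length=128,     # cap the summary length
        early_stopping=True,
    )

print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```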

We’ll use the Hugging Face Hub API to access the models, the PyTorch library to implement the deep learning logic, the Weights & Biases library to visualize training, and the Evaluate library to evaluate the model.
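The sketch below shows, under stated assumptions, how these supporting libraries are typically wired together: authenticating with the Hugging Face Hub, starting a Weights & Biases run, and computing ROUGE with the Evaluate library. The project name, logged metric, and example sentences are placeholders; the project’s own setup may differ.

```python
# Sketch of wiring up the supporting libraries; names and values shown
# here are placeholders for illustration.
import wandb
import evaluate
from huggingface_hub import login

# Authenticate with the Hugging Face Hub (reads the token interactively
# or from the environment).
login()

# Start a Weights & Biases run to track training metrics.
wandb.init(project="bart-news-summarization")   # project name is illustrative
wandb.log({"train/loss": 2.31})                 # example metric

# Compute ROUGE with the Evaluate library.
rouge = evaluate.load("rouge")
scores = rouge.compute(
    predictions=["the cat sat on the mat"],
    references=["a cat was sitting on the mat"],
)
print(scores)  # dictionary of rouge1 / rouge2 / rougeL / rougeLsum scores
```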

Project Tasks

1. Getting Started

Task 0: Introduction

Task 1: Import the Libraries

Task 2: Log in to the APIs

2. Load the Data

Task 3: Initialize the Parameters

Task 4: Read the Dataset

Task 5: Create the Data Loading Script

Task 6: Create the Data Loaders

3. Train the Model

Task 7: Get the Model from Hugging Face

Task 8: Create the Training Function

Task 9: Train the Model

4. Evaluate the Model

Task 10: Create the Evaluation Function

Task 11: Run Evaluation

Task 12: Compute Evaluation Metrics

Congratulations!