Overview: Generation of Discrete Sequences Using GANs
Get an overview of the topics covered in this chapter.
In this chapter, we will learn how to implement the model used in the paper “Adversarial Generation of Natural Language” by Rajeswar et al. The model was first described in the paper “Improved Training of Wasserstein GANs” by Gulrajani et al., and it is capable of generating short discrete sequences over small vocabularies.
We will first frame language generation as a problem of conditional probability, in which we estimate the probability of the next token given the previous tokens. Then, we will examine the challenges involved in training GANs on discrete sequences.
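As a quick reference, this autoregressive framing factorizes the probability of a whole token sequence into a product of next-token conditionals:

```latex
% Autoregressive factorization of a sequence of tokens x_1, ..., x_T:
% the probability of the full sequence is the product of the probability
% of each token given all of the tokens that came before it.
p(x_1, x_2, \ldots, x_T) = \prod_{t=1}^{T} p(x_t \mid x_1, \ldots, x_{t-1})
```

Because sampling a discrete token from such a distribution is not differentiable, gradients cannot flow from a GAN discriminator back through sampled tokens to the generator; this is the core difficulty we will return to later in the chapter.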
After this introduction to language generation, we will learn how to implement the model described in the paper by Rajeswar et al. and train it on the Google One Billion Word dataset. We will train two separate models: one to generate sequences of characters and another to generate sequences of words.
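To make the distinction between the two setups concrete, the sketch below shows how the same sentence looks as a character-level sequence and as a word-level sequence. The sample sentence and the naive whitespace tokenization are illustrative assumptions, not the actual preprocessing used later in the chapter.

```python
# Minimal sketch: character-level vs. word-level views of the same text.
# The sample sentence and the simple whitespace split are illustrative
# assumptions, not the preprocessing applied to the One Billion Word data.

sentence = "the cat sat on the mat"

# Character-level: each position in the sequence is a single character.
char_vocab = sorted(set(sentence))
char_to_id = {ch: i for i, ch in enumerate(char_vocab)}
char_sequence = [char_to_id[ch] for ch in sentence]

# Word-level: each position in the sequence is a whole word.
words = sentence.split()
word_vocab = sorted(set(words))
word_to_id = {w: i for i, w in enumerate(word_vocab)}
word_sequence = [word_to_id[w] for w in words]

print(len(char_vocab), char_sequence[:10])  # small vocabulary, long sequence
print(len(word_vocab), word_sequence)       # larger vocabulary, short sequence
```

Character-level models work with a small vocabulary but longer sequences, while word-level models keep sequences shorter at the cost of a much larger vocabulary.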
Topics covered in the chapter
The following topics will be covered in this chapter:
Natural language generation with GANs
Experimental setup
Model implementation
Inference