Tokens

Learn about tokens in GPT-3, including their limits and pricing, as well as how to monitor token consumption in the Playground.

Understanding tokens

Tokens are chunks of text, often common sequences of characters, that the model maps to numerical IDs. Using tokens as a standard unit of measure, GPT-3 can handle prompts ranging from a few words to entire documents.

For standard English text, one token corresponds to approximately four characters, or roughly three-quarters of a word, so 100 tokens amount to about 75 words. As a point of reference, the collected works of Shakespeare run to about 900,000 words, which translates to roughly 1.2 million tokens.
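
To see this ratio concretely, the sketch below counts tokens with OpenAI's tiktoken library (an assumption here; it is a separate tokenizer package, not part of the API itself), using the r50k_base encoding that GPT-3-era models use, and reports the characters-per-token ratio for a sample sentence.

```python
import tiktoken

# r50k_base is the byte-pair encoding used by GPT-3-era models.
encoding = tiktoken.get_encoding("r50k_base")

text = "Many common words map to a single token, but rarer ones split apart."
token_ids = encoding.encode(text)

print(f"Characters:      {len(text)}")
print(f"Tokens:          {len(token_ids)}")
print(f"Chars per token: {len(text) / len(token_ids):.2f}")
```

For typical English sentences like this one, the printed ratio lands close to the four-characters-per-token rule of thumb; text with unusual words, code, or non-English characters tends to use more tokens per character.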

To keep the latency of API calls manageable, OpenAI imposes a limit of 2,048 tokens (approximately 1,500 words) shared between the prompt and the completion.
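
Because the prompt and completion share that limit, a longer prompt leaves less room for the model's output. The minimal sketch below illustrates budgeting against it, reusing the tiktoken setup from above; MODEL_LIMIT is an illustrative constant for this example, not an API parameter.

```python
import tiktoken

MODEL_LIMIT = 2048  # tokens shared between prompt and completion

encoding = tiktoken.get_encoding("r50k_base")
prompt = "Summarize the plot of Hamlet in one paragraph."
prompt_tokens = len(encoding.encode(prompt))

# Whatever the prompt doesn't use remains available for the completion.
completion_budget = MODEL_LIMIT - prompt_tokens
print(f"Prompt tokens:     {prompt_tokens}")
print(f"Completion budget: {completion_budget}")
```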
