What is a token?

Tokens are pieces of words: the basic units of text that natural language processing models read and generate.

As a rough rule of thumb for English text, 1 token is approximately 4 characters or 0.75 words, so a 1,000-character paragraph works out to roughly 250 tokens.

As a point of reference, the collected works of Shakespeare total about 900,000 words, or roughly 1.2 million tokens.

To learn more about how tokens work and to estimate your own usage, you can visit the OpenAI Tokenizer tool.
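If you would rather count tokens programmatically, OpenAI's open-source tiktoken library exposes the same encodings the models use. Below is a minimal Python sketch; the sample text and the choice of the cl100k_base encoding are illustrative assumptions, not requirements.

```python
# Requires: pip install tiktoken
import tiktoken

# Load a tokenizer; cl100k_base is one of tiktoken's built-in encodings.
# (Illustrative choice; pick the encoding that matches your model.)
encoding = tiktoken.get_encoding("cl100k_base")

text = "Tokens are pieces of words used for natural language processing."
token_ids = encoding.encode(text)

# Compare the actual count with the "1 token ~= 4 characters" rule of thumb.
print(f"Characters: {len(text)}")
print(f"Tokens:     {len(token_ids)}")
print(f"Estimate:   {len(text) / 4:.0f} (chars / 4)")

# Tokens round-trip back to the original text.
assert encoding.decode(token_ids) == text
```

Note that different model families use different encodings; tiktoken's encoding_for_model function selects the appropriate one for a given model name.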

Where should I go next?