
Token

Also known as: Tokens, Tokenization

In one sentence

A chunk of text — usually a word or part of a word — that AI models process as a single unit. Most English words are one token, but longer or uncommon words get split into pieces.

Explain like I'm 12

AI reads by breaking sentences into little puzzle pieces called tokens. Common words like 'the' stay whole, but big words like 'unbelievable' get split into smaller bits — kind of like how you sound out a long word syllable by syllable.

In context

Tokens are the currency of AI. When you use ChatGPT or Claude, you're charged per token for both your input and the model's output. A typical English word averages about 1.3 tokens. The sentence 'Hello, how are you?' is roughly 6 tokens. Token limits determine how much text a model can process at once — GPT-4o handles 128,000 tokens (about 96,000 words), while Claude can handle 200,000 tokens. Understanding tokens helps you estimate costs and stay within context limits.
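Exact counts depend on each model's tokenizer, but the ~1.3 tokens-per-word average above is enough for a ballpark estimate. A minimal sketch in Python (the function names and heuristic are illustrative, not any provider's API):

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate using the ~1.3 tokens-per-word average
    for English text. Real tokenizers give exact counts; this is
    only a ballpark for cost and context-limit planning."""
    words = len(text.split())
    return round(words * 1.3)

def fits_context(text: str, limit: int = 128_000) -> bool:
    """Check an estimated count against a model's context window
    (128,000 tokens in GPT-4o's case, per the text above)."""
    return estimate_tokens(text) <= limit

print(estimate_tokens("Hello, how are you?"))  # roughly 5-6 tokens
```

For billing-accurate counts you would use the provider's own tokenizer library rather than a heuristic like this.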
