Token
Also known as: Tokens, Tokenization
In one sentence
A chunk of text — usually a word or part of a word — that AI models process as a single unit. Most English words are one token, but longer or uncommon words get split into pieces.
Explain like I'm 12
AI reads by breaking sentences into little puzzle pieces called tokens. Common words like 'the' stay whole, but big words like 'unbelievable' get split into smaller bits — kind of like how you sound out a long word syllable by syllable.
In context
Tokens are the currency of AI. When you use ChatGPT or Claude, you're charged per token for both your input and the model's output. A typical English word averages about 1.3 tokens. The sentence 'Hello, how are you?' is roughly 6 tokens. Token limits determine how much text a model can process at once — GPT-4o handles 128,000 tokens (about 96,000 words), while Claude can handle 200,000 tokens. Understanding tokens helps you estimate costs and stay within context limits.
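The rule of thumb above (about 1.3 tokens per English word) is enough for a quick back-of-envelope estimate. Here is a minimal sketch of such an estimator — a heuristic only, not a real tokenizer, so actual counts will vary by model; exact counts come from a model's own tokenizer library (for example, OpenAI's tiktoken):

```python
# Rough token estimator using the ~1.3 tokens-per-word average.
# Heuristic only: real tokenizers split on subword units and give
# exact, model-specific counts.
TOKENS_PER_WORD = 1.3

def estimate_tokens(text: str) -> int:
    """Estimate token count from the whitespace-separated word count."""
    return max(1, round(len(text.split()) * TOKENS_PER_WORD))

def fits_context(text: str, limit: int = 128_000) -> bool:
    """Check whether text likely fits a context window
    (128,000 tokens is GPT-4o's limit, per the entry above)."""
    return estimate_tokens(text) <= limit

# The heuristic gives 5 for this sentence; a real tokenizer
# reports roughly 6, so treat estimates as approximate.
print(estimate_tokens("Hello, how are you?"))
```

Estimates like this are fine for budgeting and sanity checks; for billing-accurate counts, use the tokenizer that matches your model.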
See also
Related Guides
Learn more about Token in these guides:
Token Economics: Understanding AI Costs
Intermediate · 6 min read
AI APIs charge per token. Learn how tokens work, how to estimate costs, and how to optimize spending.

Context Management: Handling Long Conversations and Documents
Intermediate · 12 min read
Master context window management for AI. Learn strategies for long conversations, document processing, memory systems, and context optimization.

Context Windows: How Much AI Can Remember
Intermediate · 8 min read
Context windows determine how much text an AI can process at once. Learn how they work, their limits, and how to work within them.