Context Window
Also known as: Context, Context Length, Window Size
In one sentence
The maximum amount of text an AI model can process at once—including both what you send and what it generates. Once the window fills up, the AI loses access to earlier parts of the conversation.
Explain like I'm 12
Imagine reading a book through a magnifying glass that only shows one page at a time. If the conversation gets longer than one page, the AI forgets the beginning. A bigger context window means a bigger magnifying glass that can see more pages at once.
In context
Context windows are measured in tokens (a token is roughly three-quarters of an English word, or about four characters). GPT-3.5 had a 4,096-token window (about 3,000 words), while GPT-4 offers up to 128,000 tokens. Claude supports up to 200,000 tokens: enough to process an entire novel. Larger context windows let you paste in long documents for analysis, maintain longer conversations without the AI forgetting earlier messages, and provide more detailed instructions. However, larger windows use more computing power and cost more per request.
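The "about four characters per token" rule of thumb above can be turned into a quick budget check before sending text to a model. This is a minimal sketch, not an exact tokenizer: real token counts vary by model and tokenizer, and the `estimate_tokens`, `fits_in_window`, and `reply_budget` names here are illustrative, not part of any API.

```python
def estimate_tokens(text: str) -> int:
    """Rough heuristic: one token is about 4 characters of English text."""
    return max(1, len(text) // 4)

def fits_in_window(text: str, window_tokens: int = 4096,
                   reply_budget: int = 500) -> bool:
    """Check whether the input plus a reserved reply budget fits the window.

    The window must hold both what you send AND what the model generates,
    so we reserve reply_budget tokens for the response.
    """
    return estimate_tokens(text) + reply_budget <= window_tokens

# A 3,000-word document (~16,000 characters) against a 4,096-token window:
document = "word " * 3000
print(fits_in_window(document, window_tokens=4096))    # too big for GPT-3.5-sized windows
print(fits_in_window(document, window_tokens=128000))  # fits easily in a 128k window
```

For accurate counts you would use the model provider's own tokenizer (for example, OpenAI's tiktoken library) rather than this character-based estimate.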
Related Guides
Learn more about Context Window in these guides:
Context Management: Handling Long Conversations and Documents (Intermediate, 12 min read)
Master context window management for AI. Learn strategies for long conversations, document processing, memory systems, and context optimization.

Context Windows: How Much AI Can Remember (Intermediate, 8 min read)
Context windows determine how much text an AI can process at once. Learn how they work, their limits, and how to work within them.

Context Engineering: Beyond Prompt Engineering (Intermediate, 12 min read)
The 2026 paradigm shift from crafting prompts to engineering entire context windows. Learn to design the informational environment that makes AI systems reliable.