Context Window

Also known as: Context, Context Length, Window Size

In one sentence

The maximum amount of text an AI model can process at once—including both what you send and what it generates. Once the window fills up, the AI loses access to earlier parts of the conversation.

Explain like I'm 12

Imagine reading a book through a magnifying glass that only shows one page at a time. If the conversation gets longer than one page, the AI forgets the beginning. A bigger context window means a bigger magnifying glass that can see more pages at once.

In context

Context windows are measured in tokens (a token is roughly three-quarters of an English word). GPT-3.5 had a 4,096-token window (about 3,000 words), while GPT-4 Turbo offers up to 128,000 tokens. Claude supports up to 200,000 tokens, enough to hold an entire novel. Larger context windows let you paste in long documents for analysis, maintain longer conversations without the AI forgetting earlier messages, and provide more detailed instructions. However, larger windows use more computing power and cost more per request.
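The "forgetting" behavior can be illustrated with a short sketch. This hypothetical helper uses a crude heuristic of about four characters per token (real models use learned tokenizers, so actual counts differ) and drops the oldest messages until the conversation fits a given window:

```python
def rough_token_count(text):
    # Crude approximation: ~4 characters per token for English text.
    # Real tokenizers (e.g. BPE) will produce different counts.
    return max(1, len(text) // 4)

def fit_to_window(messages, window_tokens):
    """Keep only the most recent messages that fit in the context window."""
    kept = []
    used = 0
    for msg in reversed(messages):       # walk from newest to oldest
        cost = rough_token_count(msg)
        if used + cost > window_tokens:
            break                        # older messages fall outside the window
        kept.append(msg)
        used += cost
    return list(reversed(kept))          # restore chronological order

conversation = [
    "Hi, my name is Maya.",              # oldest, first to be forgotten
    "Tell me about context windows.",
    "How big is Claude's window?",
]
print(fit_to_window(conversation, window_tokens=15))
# The oldest message is dropped once the budget is exceeded.
```

With a 15-token budget, only the two most recent messages survive, which is why a model in a long chat can stop "remembering" your name from the start of the conversation.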
