Context Window
Core Concepts

The maximum amount of text (measured in tokens) that an AI model can 'see' and process in a single interaction.
Full Explanation
Think of the context window as the AI's working memory. Everything inside the context window — your conversation history, instructions, documents you've shared — is available to the model. Text outside the window is forgotten. Larger context windows allow a model to process entire books, codebases, or lengthy documents in a single prompt. Claude offers a 200K-token context window; Gemini offers up to 2M tokens.
Claude's 200K context window can hold roughly 150,000 words — about the length of a full novel.
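Since token counts drive what fits in the window, a quick back-of-the-envelope check can be useful. Below is a minimal sketch using the common heuristic that English text averages roughly 4 characters per token; the function names and the `CONTEXT_WINDOW` constant are illustrative, not any vendor's API. For exact counts you would use a model's real tokenizer.

```python
# Rough heuristic: English text averages ~4 characters per token.
CHARS_PER_TOKEN = 4
CONTEXT_WINDOW = 200_000  # e.g. a 200K-token window

def estimate_tokens(text: str) -> int:
    """Approximate the token count of a string (heuristic, not exact)."""
    return len(text) // CHARS_PER_TOKEN

def fits_in_window(text: str, window: int = CONTEXT_WINDOW) -> bool:
    """Check whether a prompt would plausibly fit in the context window."""
    return estimate_tokens(text) <= window

# A 400-character passage is roughly 100 tokens under this heuristic.
print(estimate_tokens("x" * 400))  # → 100
print(fits_in_window("x" * 400))   # → True
```

Real tokenizers vary by model, so this estimate can be off by 20% or more; it is only a sanity check before sending a large document.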
Related Terms
Token — The basic unit of text that AI language models process, roughly equivalent to 3/4 of a word in English.
Large Language Model (LLM) — A type of AI model trained on vast amounts of text data that can generate, summarize, translate, and reason about language.
Retrieval-Augmented Generation (RAG) — A technique that enhances LLM responses by retrieving relevant documents from an external knowledge base before generating an answer.