Claude Context Window Explained: From 200K to 1M Tokens

Updated April 2026: Claude Sonnet 4.6 and Opus 4.6 now support a 1 million token context window at standard pricing. Haiku 4.5 supports 200,000 tokens. The information below has been updated to reflect current specs.

Claude’s context window is one of its most practically important technical specifications — and one of the least well understood. This guide explains what tokens and context windows are, how Claude’s limits compare to competitors’, and strategies for working effectively within them.

What Is a Context Window?

A context window is the total amount of text a model can process in a single session — everything it can “see” and reason about at once. Context is measured in tokens. As a practical rule: 1,000 tokens ≈ 750 words.
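The 1,000-tokens-to-750-words rule of thumb converts in both directions. A minimal sketch of that arithmetic (the function names here are illustrative, not from any Claude SDK):

```python
def estimate_tokens(word_count: int) -> int:
    """Rough token count for a text, using the 1,000 tokens ~= 750 words rule."""
    return round(word_count * 1000 / 750)

def estimate_words(token_count: int) -> int:
    """Inverse: approximate words that fit in a given token budget."""
    return round(token_count * 750 / 1000)
```

For example, `estimate_words(200_000)` gives 150,000 words for a 200K-token window. Actual tokenization depends on the text, so treat these as ballpark figures.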

Claude’s Context Windows

| Access Method | Context Window | Approx. Words |
| --- | --- | --- |
| Standard Claude (all plans) | 1,000,000 tokens (Sonnet/Opus); 200,000 (Haiku) | ~750,000 (Sonnet/Opus); ~150,000 (Haiku) |
| Enterprise Claude | 500,000 tokens | ~375,000 |
| Claude Code | 1,000,000 tokens | ~750,000 |

What Fits in 200K Tokens?

  • A full-length novel (~100,000 words)
  • 100-200 typical business emails
  • 10-15 long research papers
  • An entire small codebase (5,000-10,000 lines)
  • A year’s worth of meeting notes from a small team

PDF and Document Token Costs

  • PDFs: 1,500-3,000 tokens per page
  • Plain text: ~1 token per 4 characters
  • Images: 1,000-4,000 tokens per image
  • Code files: 500-2,000 tokens per file
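The per-type costs above can be combined into a quick budget check before uploading a mixed batch of material. A minimal sketch using the midpoint of each range (the names and midpoints are assumptions for illustration; real tokenization varies by content):

```python
# Midpoints of the heuristic ranges listed above.
TOKEN_COSTS = {
    "pdf_page": 2250,   # 1,500-3,000 tokens per PDF page
    "text_char": 0.25,  # ~1 token per 4 characters of plain text
    "image": 2500,      # 1,000-4,000 tokens per image
    "code_file": 1250,  # 500-2,000 tokens per code file
}

def estimate_budget(pdf_pages=0, text_chars=0, images=0, code_files=0) -> int:
    """Ballpark token total for a mixed upload."""
    return round(
        pdf_pages * TOKEN_COSTS["pdf_page"]
        + text_chars * TOKEN_COSTS["text_char"]
        + images * TOKEN_COSTS["image"]
        + code_files * TOKEN_COSTS["code_file"]
    )
```

Comparing the result against your model's window (200K for Haiku, 1M for Sonnet/Opus) tells you whether to extract sections first.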

Strategies for Long Contexts

  • Extract before uploading: Only upload relevant PDF sections, not full documents
  • Use Projects for reference material: Store knowledge base docs in Projects rather than re-uploading every session
  • Auto compaction (Claude Code beta): When coding sessions approach limits, Claude automatically summarizes history to continue

Frequently Asked Questions

How many pages can Claude read at once?

With a 200K-token window and roughly 1,500-3,000 tokens per PDF page, about 65-130 pages, leaving room for the conversation itself.
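That page range comes from dividing the usable window by the per-page cost at both ends of the range. A small sketch of the arithmetic (the function and its `reserve` parameter are illustrative, not part of any API):

```python
def pdf_page_capacity(context_tokens: int, reserve: int = 0) -> tuple[int, int]:
    """(min_pages, max_pages) that fit, at 3,000 and 1,500 tokens per page.

    `reserve` sets aside tokens for the conversation itself.
    """
    usable = context_tokens - reserve
    return usable // 3000, usable // 1500
```

With a 200K window and a 5,000-token reserve for conversation, this yields the 65-130 page range quoted above.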

Does Claude forget things in long conversations?

No — everything within the context window remains fully available. In very long conversations that approach the limit, the oldest content may be truncated to make room.

