Claude Enterprise extended context window
Quick answer
Use the claude-enterprise model with the context_window parameter set to the desired token length to enable Claude Enterprise's extended context window. This allows larger inputs to be processed and longer conversations to be maintained within the Anthropic Enterprise API.

Prerequisites
- Python 3.8+
- Anthropic Enterprise API key
- pip install anthropic>=0.20
Setup
Install the anthropic Python SDK and set your Anthropic Enterprise API key as an environment variable.
- Install the SDK: pip install anthropic
- Set the environment variable: export ANTHROPIC_API_KEY='your_enterprise_api_key'
pip install anthropic output:
Collecting anthropic
  Downloading anthropic-0.20.0-py3-none-any.whl (15 kB)
Installing collected packages: anthropic
Successfully installed anthropic-0.20.0
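Before making any calls, it helps to fail fast if the key is missing. A minimal sketch (the helper name get_api_key is illustrative, not part of the SDK):

```python
import os


def get_api_key(env_var: str = "ANTHROPIC_API_KEY") -> str:
    """Return the Enterprise API key from the environment, failing fast if unset."""
    key = os.environ.get(env_var)
    if not key:
        raise RuntimeError(
            f"{env_var} is not set; export your Anthropic Enterprise API key first."
        )
    return key
```

Calling get_api_key() at startup surfaces a clear error before the first request is sent, rather than a less obvious authentication failure later.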
Step by step
Use the Anthropic client with the claude-enterprise model and specify the context_window parameter to extend the context length. This example shows how to send a prompt with a 100k token context window.
import os
import anthropic

client = anthropic.Anthropic(api_key=os.environ["ANTHROPIC_API_KEY"])

response = client.messages.create(
    model="claude-enterprise",
    max_tokens=1024,
    context_window=100000,
    system="You are a helpful assistant.",
    messages=[{"role": "user", "content": "Explain the benefits of an extended context window."}],
)
print(response.content[0].text)

Output:
The extended context window in Claude Enterprise allows processing of much larger inputs, enabling long documents, codebases, or conversations to be handled in a single session without losing context. This improves coherence and reduces the need for manual context management.
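Since context_window is measured in tokens, it is useful to check whether a prompt is likely to fit before sending it. A rough sketch, assuming roughly four characters per token for English text (a common heuristic, not the model's actual tokenizer):

```python
def estimate_tokens(text: str) -> int:
    # Rough heuristic: ~4 characters per token for English text.
    # This approximates, but does not match, the model's real tokenizer.
    return max(1, len(text) // 4)


def fits_in_window(text: str, context_window: int, max_tokens: int) -> bool:
    # Leave room inside the window for the model's reply (max_tokens).
    return estimate_tokens(text) + max_tokens <= context_window
```

For example, fits_in_window(prompt, 100000, 1024) returns False when the estimated prompt size plus the reply budget would exceed the 100k-token window, letting you trim the prompt before the API rejects it.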
Common variations
You can adjust the context_window parameter to other supported sizes depending on your subscription and use case. The claude-enterprise model supports up to 100k tokens as of 2026-04.
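To catch an unsupported size on the client side rather than waiting for an API error, you can validate the requested value up front. A minimal sketch; the list of supported sizes below is an assumption for illustration, so check your subscription for the actual values:

```python
# Hypothetical supported sizes; confirm the real values for your subscription.
SUPPORTED_CONTEXT_WINDOWS = (50_000, 100_000)


def validate_context_window(requested: int) -> int:
    # Reject unsupported sizes locally before making an API call.
    if requested not in SUPPORTED_CONTEXT_WINDOWS:
        raise ValueError(
            f"context_window={requested} is not supported; "
            f"choose one of {SUPPORTED_CONTEXT_WINDOWS}."
        )
    return requested
```

Passing the validated value straight into messages.create keeps the check and the call adjacent: context_window=validate_context_window(100_000).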
For asynchronous usage, use async methods in the anthropic SDK. Streaming is not currently supported for extended context windows.
import asyncio
import os
import anthropic
import asyncio
import os
import anthropic

async def main():
    # Use the async client; messages.create is awaited directly.
    client = anthropic.AsyncAnthropic(api_key=os.environ["ANTHROPIC_API_KEY"])
    response = await client.messages.create(
        model="claude-enterprise",
        max_tokens=1024,
        context_window=100000,
        system="You are a helpful assistant.",
        messages=[{"role": "user", "content": "Summarize a long document."}],
    )
    print(response.content[0].text)

asyncio.run(main())

Output:
The extended context window enables Claude Enterprise to summarize very long documents efficiently by maintaining full context, which improves accuracy and relevance in the summary.
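The main benefit of the async client is fanning out several requests concurrently. The sketch below shows the asyncio.gather pattern with a stand-in coroutine; summarize is hypothetical and its body would be replaced by the awaited messages.create call from the example above:

```python
import asyncio


async def summarize(doc: str) -> str:
    # Stand-in for the real awaited API call; replace the body with
    # client.messages.create(...) as shown in the async example above.
    await asyncio.sleep(0)  # simulate network I/O
    return f"summary of {len(doc)} chars"


async def summarize_all(docs):
    # Fan out requests concurrently; results keep the input order.
    return await asyncio.gather(*(summarize(d) for d in docs))


results = asyncio.run(summarize_all(["doc one", "doc two"]))
```

Because gather preserves input order, results[i] always corresponds to docs[i], which makes it straightforward to pair summaries back with their source documents.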
Troubleshooting
- If you receive an error about an unsupported context window size: verify your subscription supports extended context windows and pass a valid context_window value (e.g., 100000 tokens).
- If responses are truncated: ensure max_tokens is set appropriately and does not exceed the context_window limit.
- If latency is high: large context windows increase processing time; optimize prompt length and token usage accordingly.
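When an input genuinely exceeds the window, one workaround is to split it into pieces that each fit and process them separately. A minimal sketch using the same rough chars-per-token heuristic mentioned earlier (an assumption, not the model's real tokenizer):

```python
def chunk_text(text: str, token_budget: int, chars_per_token: int = 4):
    # Split text into pieces that each fit within token_budget, using a
    # rough chars-per-token heuristic rather than the real tokenizer.
    max_chars = token_budget * chars_per_token
    return [text[i:i + max_chars] for i in range(0, len(text), max_chars)]
```

Each chunk can then be sent as its own request; note that naive fixed-width splitting can cut mid-sentence, so a production version would prefer splitting on paragraph or sentence boundaries.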
Key Takeaways
- Use the context_window parameter with claude-enterprise to enable extended context up to 100k tokens.
- Set max_tokens carefully to avoid truncation within the extended context.
- Extended context windows improve handling of long documents and conversations without losing coherence.
- Check your Anthropic Enterprise subscription supports extended context windows before usage.
- Use asynchronous calls for scalable integration; streaming is not supported for extended context windows.