How to use Claude for text summarization in Python
Direct answer
Use the anthropic Python SDK to call client.messages.create with the claude-3-5-sonnet-20241022 model, passing your text as a user message and a system prompt instructing it to summarize.
Setup
Install
pip install anthropic
Env vars
ANTHROPIC_API_KEY
Imports
import os
import anthropic
Examples
in: Summarize: "Artificial intelligence is transforming industries by automating tasks and enabling new capabilities."
out: AI is revolutionizing industries through automation and new capabilities.
in: Summarize: "The quick brown fox jumps over the lazy dog multiple times to demonstrate agility."
out: A quick fox shows agility by repeatedly jumping over a lazy dog.
in: Summarize: "In 2026, climate change remains a critical global challenge requiring urgent action from all nations."
out: Climate change is a pressing global issue needing urgent worldwide action in 2026.
Integration steps
- Install the Anthropic Python SDK and set the ANTHROPIC_API_KEY environment variable.
- Import the anthropic library and initialize the client with the API key from os.environ.
- Create a system prompt instructing Claude to summarize the given text.
- Send the user message containing the text to summarize using client.messages.create with the Claude model.
- Extract the summary from the first text block of the response's content list (response.content[0].text).
- Print or use the summarized output as needed.
Full code
import os
import anthropic
client = anthropic.Anthropic(api_key=os.environ["ANTHROPIC_API_KEY"])
text_to_summarize = (
"Artificial intelligence is transforming industries by automating tasks and enabling new capabilities."
)
system_prompt = "You are a helpful assistant that summarizes text concisely."
response = client.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=200,
    system=system_prompt,
    messages=[{"role": "user", "content": f"Summarize the following text:\n{text_to_summarize}"}]
)
summary = response.content[0].text
print("Summary:", summary)
Output
Summary: AI is transforming industries by automating tasks and enabling new capabilities.
API trace
Request
{"model": "claude-3-5-sonnet-20241022", "max_tokens": 200, "system": "You are a helpful assistant that summarizes text concisely.", "messages": [{"role": "user", "content": "Summarize the following text:\nArtificial intelligence is transforming industries by automating tasks and enabling new capabilities."}]}
Response
{"id": "msg_xxx", "type": "message", "role": "assistant", "model": "claude-3-5-sonnet-20241022", "content": [{"type": "text", "text": "AI is transforming industries by automating tasks and enabling new capabilities."}], "stop_reason": "end_turn", "usage": {"input_tokens": 50, "output_tokens": 20}}
Extract
response.content[0].text
Variants
Streaming Summarization ›
Use streaming to display the summary progressively for better user experience with longer texts.
import os
import anthropic
client = anthropic.Anthropic(api_key=os.environ["ANTHROPIC_API_KEY"])
text_to_summarize = "Artificial intelligence is transforming industries by automating tasks and enabling new capabilities."
system_prompt = "You are a helpful assistant that summarizes text concisely."
with client.messages.stream(
    model="claude-3-5-sonnet-20241022",
    max_tokens=200,
    system=system_prompt,
    messages=[{"role": "user", "content": f"Summarize the following text:\n{text_to_summarize}"}]
) as stream:
    summary = ""
    for text in stream.text_stream:  # yields text deltas as they arrive
        summary += text
print("Summary:", summary)
Async Summarization ›
Use async calls to handle multiple summarization requests concurrently in an async application.
import os
import asyncio
import anthropic
async def summarize_text():
    # The async client is AsyncAnthropic; the method is still messages.create, awaited.
    client = anthropic.AsyncAnthropic(api_key=os.environ["ANTHROPIC_API_KEY"])
    text_to_summarize = "Artificial intelligence is transforming industries by automating tasks and enabling new capabilities."
    system_prompt = "You are a helpful assistant that summarizes text concisely."
    response = await client.messages.create(
        model="claude-3-5-sonnet-20241022",
        max_tokens=200,
        system=system_prompt,
        messages=[{"role": "user", "content": f"Summarize the following text:\n{text_to_summarize}"}]
    )
    print("Summary:", response.content[0].text)

asyncio.run(summarize_text())
Alternative Model - Claude 3 Opus ›
Use the Claude 3 Opus model when summary quality matters more than speed and cost; it is the most capable, but also the slowest and most expensive, model in the Claude 3 family.
import os
import anthropic
client = anthropic.Anthropic(api_key=os.environ["ANTHROPIC_API_KEY"])
text_to_summarize = "Artificial intelligence is transforming industries by automating tasks and enabling new capabilities."
system_prompt = "You are a helpful assistant that summarizes text concisely."
response = client.messages.create(
    model="claude-3-opus-20240229",
    max_tokens=200,
    system=system_prompt,
    messages=[{"role": "user", "content": f"Summarize the following text:\n{text_to_summarize}"}]
)
print("Summary:", response.content[0].text)
Performance
Latency: ~1.2 seconds for a 200-token summary on claude-3-5-sonnet-20241022
Cost: ~$0.003 per 200 tokens summarized
Rate limits: Tier 1: 300 requests per minute, 20,000 tokens per minute
- Keep your input text concise to reduce token usage.
- Use the max_tokens parameter to cap output length.
- Batch multiple texts if possible to optimize calls.
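On the rate-limit point above: when a burst exceeds your tier's limits the SDK raises a rate-limit error, and retrying with exponential backoff usually recovers. Below is a minimal, generic sketch; the attempt count and delays are illustrative, and in real code you would pass retryable=(anthropic.RateLimitError,) so only rate-limit failures trigger a retry (the SDK client also accepts a built-in max_retries option).

```python
import time

def with_retries(fn, max_attempts=4, base_delay=1.0, retryable=(Exception,)):
    """Call fn(), retrying with exponential backoff on retryable errors.
    With the real SDK, pass retryable=(anthropic.RateLimitError,)."""
    for attempt in range(max_attempts):
        try:
            return fn()
        except retryable:
            if attempt == max_attempts - 1:
                raise  # out of attempts; surface the error to the caller
            time.sleep(base_delay * (2 ** attempt))  # 1s, 2s, 4s, ...

# Stub that fails twice with a simulated 429 before succeeding:
calls = {"n": 0}
def flaky_summarize():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("429: rate limited")
    return "AI is transforming industries."

print(with_retries(flaky_summarize, base_delay=0.01))  # → AI is transforming industries.
```

In production you would wrap the client.messages.create call from the full code in a lambda and hand it to with_retries.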
| Approach | Latency | Cost/call | Best for |
|---|---|---|---|
| Standard call | ~1.2s | ~$0.003 | Simple summarization tasks |
| Streaming | ~1.2s (progressive) | ~$0.003 | Long texts with better UX |
| Async | ~1.2s (concurrent) | ~$0.003 | High throughput applications |
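The tip about keeping input concise generalizes: for texts that exceed the context window, a common pattern is map-reduce summarization — split the text into chunks, summarize each chunk with one of the calls above, then summarize the joined chunk summaries. Here is a sketch of the splitting step only; the chunking is character-based for simplicity, and a production version would count tokens instead.

```python
def split_into_chunks(text, max_chars=4000):
    """Split text into chunks of at most ~max_chars, breaking on sentence
    boundaries so no chunk ends mid-sentence."""
    sentences = [s.strip() for s in text.split(". ") if s.strip()]
    chunks, current = [], ""
    for s in sentences:
        if not s.endswith("."):
            s += "."  # restore the period removed by the split
        if current and len(current) + 1 + len(s) > max_chars:
            chunks.append(current)  # current chunk is full; start a new one
            current = s
        else:
            current = (current + " " + s).strip()
    if current:
        chunks.append(current)
    return chunks

long_text = "First point here. Second point here. Third point here."
print(split_into_chunks(long_text, max_chars=25))
# → ['First point here.', 'Second point here.', 'Third point here.']
```

Each chunk would then go through client.messages.create as in the full code, and the per-chunk summaries would be concatenated and summarized once more for the final result.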
Quick tip
Always provide a clear system prompt (for example, one that specifies a target length) so Claude returns a focused summary rather than preamble or commentary.
Common mistake
Beginners often skip the system= parameter and instead put system instructions inside the messages array as a {"role": "system"} message; the Messages API rejects that role, so instructions must go in the top-level system= parameter.
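To make the distinction concrete, here are the correct and incorrect request payload shapes as plain dictionaries — a structural sketch, not a live call, with placeholder content strings:

```python
# Correct: system instructions go in the top-level "system" field.
correct = {
    "model": "claude-3-5-sonnet-20241022",
    "max_tokens": 200,
    "system": "You are a helpful assistant that summarizes text concisely.",
    "messages": [{"role": "user", "content": "Summarize the following text: ..."}],
}

# Incorrect: the Messages API rejects a "system" role inside messages.
incorrect = {
    "model": "claude-3-5-sonnet-20241022",
    "max_tokens": 200,
    "messages": [
        {"role": "system", "content": "You are a helpful assistant ..."},  # rejected
        {"role": "user", "content": "Summarize the following text: ..."},
    ],
}

# Only "user" and "assistant" roles are valid inside messages.
print(all(m["role"] in ("user", "assistant") for m in correct["messages"]))    # → True
print(all(m["role"] in ("user", "assistant") for m in incorrect["messages"]))  # → False
```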