How to use few-shot examples in prompts
Quick answer
Use few-shot examples by including a few input-output pairs in your prompt before the new query to guide the model's behavior. This technique helps models like gpt-4o understand the task format and produce more accurate responses.
Prerequisites
- Python 3.8+
- OpenAI API key (free tier works)
- pip install "openai>=1.0"
Setup
Install the OpenAI Python SDK and set your API key as an environment variable.
pip install "openai>=1.0"
Step by step
Include a few input-output examples in the prompt to demonstrate the task, then add the new input for the model to complete.
import os
from openai import OpenAI
client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])
# Two worked example pairs, then the new query left for the model to complete.
few_shot_prompt = '''
Translate English to French:
English: Hello, how are you?
French: Bonjour, comment ça va ?
English: What is your name?
French: Comment vous appelez-vous ?
English: Where is the library?
French:'''
response = client.chat.completions.create(
model="gpt-4o",
messages=[{"role": "user", "content": few_shot_prompt}]
)
print(response.choices[0].message.content.strip())
Output
Où est la bibliothèque ?
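The prompt above can also be assembled programmatically, which keeps the examples and the query in one place. This is a minimal sketch using a hypothetical helper, `build_few_shot_prompt`, hard-coded to the English/French labels used above; adapt the labels to your own task.

```python
def build_few_shot_prompt(instruction, examples, query):
    """Assemble a few-shot prompt from an instruction, example pairs, and a new query."""
    lines = [instruction]
    for english, french in examples:
        lines.append(f"English: {english}")
        lines.append(f"French: {french}")
    # End with the new input and an open label for the model to complete.
    lines.append(f"English: {query}")
    lines.append("French:")
    return "\n".join(lines)

prompt = build_few_shot_prompt(
    "Translate English to French:",
    [
        ("Hello, how are you?", "Bonjour, comment ça va ?"),
        ("What is your name?", "Comment vous appelez-vous ?"),
    ],
    "Where is the library?",
)
print(prompt)
```

The resulting string can be passed as the user message content in the call shown above.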
Common variations
You can apply few-shot prompting with other models, such as claude-3-5-haiku-20241022, or call the API asynchronously. Adjust the examples to fit your task and the model's capabilities.
import anthropic
import os
client = anthropic.Anthropic(api_key=os.environ["ANTHROPIC_API_KEY"])
# Same few-shot structure as above, reused with the Anthropic SDK.
few_shot_prompt = '''Translate English to French:
English: Hello, how are you?
French: Bonjour, comment ça va ?
English: What is your name?
French: Comment vous appelez-vous ?
English: Where is the library?
French:'''
message = client.messages.create(
model="claude-3-5-haiku-20241022",
max_tokens=100,
system="You are a helpful assistant.",
messages=[{"role": "user", "content": few_shot_prompt}]
)
print(message.content[0].text.strip())
Output
Où est la bibliothèque ?
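Another common variation with chat APIs is to supply the examples as alternating user/assistant turns rather than a single prompt string. A minimal sketch of the message construction (the `examples` list and variable names are illustrative):

```python
# Few-shot examples expressed as alternating user/assistant turns,
# a common pattern for chat-style APIs.
examples = [
    ("Hello, how are you?", "Bonjour, comment ça va ?"),
    ("What is your name?", "Comment vous appelez-vous ?"),
]

messages = []
for english, french in examples:
    messages.append({"role": "user", "content": english})
    messages.append({"role": "assistant", "content": french})
# The new query is the final user turn.
messages.append({"role": "user", "content": "Where is the library?"})
```

The resulting `messages` list can be passed as the `messages` argument of either SDK's chat call; with the Anthropic SDK, keep the task instruction in the separate `system` parameter rather than as a message.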
Troubleshooting
If the model output is off-topic or incomplete, ensure your few-shot examples are clear, consistent, and closely related to the task. Also, keep the prompt concise to avoid token limits.
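To catch oversized prompts before sending them, you can apply a rough length check. This sketch uses a crude heuristic of roughly four characters per token for English text; the function name and threshold are illustrative, and a real tokenizer such as tiktoken should be used for accurate counts.

```python
def rough_token_estimate(text: str) -> int:
    # Crude heuristic: English text averages roughly 4 characters per token.
    # Use a real tokenizer (e.g. tiktoken) for accurate counts.
    return len(text) // 4

# Simulate a prompt with many repeated few-shot examples.
prompt = "English: Hello, how are you?\nFrench: Bonjour, comment ça va ?\n" * 150

if rough_token_estimate(prompt) > 2000:
    print("Prompt is likely too long; trim or shorten the examples.")
```

A check like this is cheap to run before every request and makes it obvious when accumulated examples are crowding out the model's room to respond.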
Key Takeaways
- Include clear input-output pairs before your query to guide the model with few-shot examples.
- Use the latest SDK patterns and models like gpt-4o or claude-3-5-haiku-20241022 for best results.
- Keep examples relevant and concise to avoid exceeding token limits and confusing the model.