How-to · Beginner · 3 min read

How to use examples in prompts

Quick answer
Use examples in prompts by providing input-output pairs or demonstrations within the prompt text to guide the AI model's behavior. This technique, called few-shot prompting, improves response accuracy and relevance by showing the model exactly what you expect.

PREREQUISITES

  • Python 3.8+
  • OpenAI API key (free tier works)
  • pip install "openai>=1.0"

Setup

Install the openai Python package and set your API key as an environment variable for secure access.

bash
pip install "openai>=1.0"

Step by step

Provide clear examples in your prompt by including input-output pairs before the actual query. This guides the model to follow the pattern.

python
import os
from openai import OpenAI

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

prompt = """
Translate English to French:

English: Hello, how are you?
French: Bonjour, comment ça va ?

English: What is your name?
French: Quel est ton nom ?

English: Where is the library?
French:"""

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": prompt}]
)

print(response.choices[0].message.content.strip())
output
Où est la bibliothèque ?
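
With chat models, examples can also be supplied as prior conversation turns instead of one inline text block: each input-output pair becomes a user/assistant exchange. This is a minimal sketch; the `build_fewshot_messages` helper is illustrative, not part of the OpenAI SDK.

```python
# Build a few-shot chat message list: each (input, output) example pair
# becomes a user turn followed by an assistant turn, ending with the query.
def build_fewshot_messages(examples, query):
    messages = [{
        "role": "system",
        "content": "Translate English to French. Reply with the French only.",
    }]
    for english, french in examples:
        messages.append({"role": "user", "content": english})
        messages.append({"role": "assistant", "content": french})
    messages.append({"role": "user", "content": query})
    return messages

examples = [
    ("Hello, how are you?", "Bonjour, comment ça va ?"),
    ("What is your name?", "Quel est ton nom ?"),
]
messages = build_fewshot_messages(examples, "Where is the library?")
# Pass this list to client.chat.completions.create(model=..., messages=messages)
```

Message-turn examples often steer chat models more reliably than inline text, because the model sees them in the same format as the live conversation.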

Common variations

The same pattern works with other providers' models, such as Anthropic's claude-3-5-sonnet-20241022, and adding more examples usually improves accuracy. Streaming and async calls are also available, depending on the SDK.

python
import anthropic
import os

client = anthropic.Anthropic(api_key=os.environ["ANTHROPIC_API_KEY"])

system_prompt = "You are a helpful assistant that translates English to French using examples."

user_prompt = """
Translate English to French:

English: Good morning
French: Bonjour

English: Thank you
French: Merci

English: See you later
French:"""

message = client.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=50,
    system=system_prompt,
    messages=[{"role": "user", "content": user_prompt}]
)

print(message.content[0].text.strip())
output
À plus tard

Troubleshooting

If the model ignores your examples, check that they are clear, consistently formatted, and unambiguous; avoid mixing styles within one prompt. Also confirm the prompt fits within the model's context window, so none of your examples are truncated.
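
A quick way to check the token limit is to estimate usage before sending. This sketch uses the rough 4-characters-per-token rule of thumb for English text; the helper names and the 128k default are illustrative, and a real tokenizer such as tiktoken gives exact counts.

```python
# Crude pre-flight check that a few-shot prompt plus the expected reply fits
# the model's context window. The chars/4 ratio is a rough heuristic for
# English text, not an exact count.
def estimate_tokens(text: str) -> int:
    return max(1, len(text) // 4)

def fits_context(prompt: str, max_context_tokens: int = 128_000,
                 reply_budget: int = 512) -> bool:
    """Return True if the prompt plus a reply budget fits the context."""
    return estimate_tokens(prompt) + reply_budget <= max_context_tokens

few_shot_prompt = "English: Hello\nFrench: Bonjour\n\nEnglish: Thank you\nFrench:"
print(fits_context(few_shot_prompt))  # a short prompt fits comfortably
```

If the check fails, drop the oldest or least representative examples first rather than truncating them mid-pair.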

Key Takeaways

  • Include clear input-output examples in your prompt to guide the AI's response style and content.
  • Use few-shot prompting to improve accuracy without fine-tuning the model.
  • Format examples consistently and keep them concise to fit within token limits.
Verified 2026-04 · gpt-4o, claude-3-5-sonnet-20241022