# Fireworks AI supported models

## Quick answer
Fireworks AI supports models with names starting with accounts/fireworks/models/, such as accounts/fireworks/models/llama-v3p3-70b-instruct. Use the OpenAI SDK with the base_url set to https://api.fireworks.ai/inference/v1 and your Fireworks API key for authentication.
## Prerequisites

- Python 3.8+
- Fireworks AI API key
- pip install openai>=1.0
## Setup
Install the openai Python package (version 1.0 or higher) and set your Fireworks AI API key as an environment variable.
- Install the OpenAI SDK:

  ```shell
  pip install openai
  ```

- Set the environment variable:

  ```shell
  export FIREWORKS_API_KEY='your_api_key'   # Linux/macOS
  set FIREWORKS_API_KEY=your_api_key        # Windows
  ```

Expected output of pip install openai:

```
Collecting openai
  Downloading openai-1.x.x-py3-none-any.whl (xx kB)
Installing collected packages: openai
Successfully installed openai-1.x.x
```
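Before creating a client, it can help to fail fast if the key is missing from the environment. The helper below is illustrative (not part of any SDK), and its error message is an assumption:

```python
import os

def get_fireworks_key() -> str:
    """Return the Fireworks API key from the environment, or fail with a clear message."""
    key = os.environ.get("FIREWORKS_API_KEY")
    if not key:
        raise RuntimeError(
            "FIREWORKS_API_KEY is not set; export it before creating the client."
        )
    return key
```

Calling get_fireworks_key() at startup surfaces a misconfigured environment immediately instead of as an authentication error on the first request.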
## Step by step
Use the OpenAI SDK with Fireworks AI by specifying the base_url and your API key. The model names always start with accounts/fireworks/models/. Below is a complete example to call the llama-v3p3-70b-instruct model.
```python
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["FIREWORKS_API_KEY"],
    base_url="https://api.fireworks.ai/inference/v1",
)

response = client.chat.completions.create(
    model="accounts/fireworks/models/llama-v3p3-70b-instruct",
    messages=[{"role": "user", "content": "Hello, Fireworks AI!"}],
)

print(response.choices[0].message.content)
```

Output:

```
Hello, Fireworks AI! How can I assist you today?
```
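Because the Fireworks endpoint is OpenAI-compatible, streaming with stream=True should work the same way as against the OpenAI API. A sketch under that assumption; the collect_stream helper is illustrative, not part of either SDK:

```python
import os

def collect_stream(chunks) -> str:
    """Concatenate the content deltas from a chat-completions stream."""
    parts = []
    for chunk in chunks:
        delta = chunk.choices[0].delta.content
        if delta:  # the final chunk typically carries a None delta
            parts.append(delta)
    return "".join(parts)

if __name__ == "__main__" and os.environ.get("FIREWORKS_API_KEY"):
    from openai import OpenAI

    client = OpenAI(
        api_key=os.environ["FIREWORKS_API_KEY"],
        base_url="https://api.fireworks.ai/inference/v1",
    )
    stream = client.chat.completions.create(
        model="accounts/fireworks/models/llama-v3p3-70b-instruct",
        messages=[{"role": "user", "content": "Hello, Fireworks AI!"}],
        stream=True,
    )
    print(collect_stream(stream))
```

In a real application you would print each delta as it arrives rather than buffering the whole reply.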
## Common variations
You can switch to other supported Fireworks AI models by changing the model parameter. Common models include:
| Model name |
|---|
| accounts/fireworks/models/llama-v3p3-70b-instruct |
| accounts/fireworks/models/deepseek-r1 |
| accounts/fireworks/models/mixtral-8x7b-instruct |
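Since every name in the table shares the accounts/fireworks/models/ prefix, a small helper (illustrative, not part of any SDK) can expand a short id into the full model name:

```python
FIREWORKS_PREFIX = "accounts/fireworks/models/"

def full_model_name(name: str) -> str:
    """Prepend the Fireworks model prefix unless the name already carries it."""
    if name.startswith(FIREWORKS_PREFIX):
        return name
    return FIREWORKS_PREFIX + name
```

For example, full_model_name("deepseek-r1") returns "accounts/fireworks/models/deepseek-r1", and already-qualified names pass through unchanged.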
## Troubleshooting

- If you get authentication errors, verify that your FIREWORKS_API_KEY environment variable is set correctly.
- If the model is not found, ensure you use the full model name starting with accounts/fireworks/models/.
- For network issues, check your internet connection and firewall settings.
## Key Takeaways
- Fireworks AI models require the base_url parameter set to https://api.fireworks.ai/inference/v1 in the OpenAI SDK.
- Model names always start with accounts/fireworks/models/ followed by the specific model identifier.
- Use environment variables for API keys to keep credentials secure and avoid hardcoding.
- Common Fireworks AI models include llama-v3p3-70b-instruct, deepseek-r1, and mixtral-8x7b-instruct.
- Check environment variable setup and model naming if you encounter errors.