How-to · Beginner · 3 min read

Together AI supported models list

Quick answer
Together AI serves models such as meta-llama/Llama-3.3-70B-Instruct-Turbo through an OpenAI-compatible API endpoint. Use the openai Python SDK with base_url="https://api.together.xyz/v1" and your Together AI API key to call them.

PREREQUISITES

  • Python 3.8+
  • Together AI API key
  • pip install openai>=1.0

Setup

Install the openai Python package and set your Together AI API key as an environment variable.

  • Run pip install openai
  • Set TOGETHER_API_KEY in your environment
bash
pip install openai
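Before running any of the examples below, make sure the key is actually exported in the shell that launches Python. A minimal check (the key value here is a placeholder; substitute your real key):

```shell
# Set the Together AI key for the current shell session
# (replace the placeholder with your real key)
export TOGETHER_API_KEY="tok-your-key-here"

# Fail fast if the variable is missing or empty
: "${TOGETHER_API_KEY:?TOGETHER_API_KEY is not set}"
echo "TOGETHER_API_KEY is set"
```

Note that `export` only affects the current session; add the line to your shell profile to persist it.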

Step by step

Use the OpenAI-compatible openai SDK with Together AI's base URL and your API key to call a supported model.

python
import os
from openai import OpenAI

client = OpenAI(api_key=os.environ["TOGETHER_API_KEY"], base_url="https://api.together.xyz/v1")

response = client.chat.completions.create(
    model="meta-llama/Llama-3.3-70B-Instruct-Turbo",
    messages=[{"role": "user", "content": "Hello, Together AI!"}]
)

print(response.choices[0].message.content)
output
Hello, Together AI! How can I assist you today?

Common variations

You can switch models by changing the model parameter to any other supported Together AI model ID. Streaming responses and async calls are also supported; for async use, the openai SDK provides an AsyncOpenAI client.

python
import asyncio
import os
from openai import AsyncOpenAI

async def main():
    client = AsyncOpenAI(api_key=os.environ["TOGETHER_API_KEY"], base_url="https://api.together.xyz/v1")

    # Async streaming: with stream=True, create() returns an async iterator of chunks
    stream = await client.chat.completions.create(
        model="meta-llama/Llama-3.3-70B-Instruct-Turbo",
        messages=[{"role": "user", "content": "Stream a hello message."}],
        stream=True
    )

    async for chunk in stream:
        print(chunk.choices[0].delta.content or "", end="", flush=True)

asyncio.run(main())
output
Hello! This is a streamed response from Together AI.

Troubleshooting

  • If you get authentication errors, verify your TOGETHER_API_KEY environment variable is set correctly.
  • For model not found errors, confirm the model name matches the current supported list.
  • Check network connectivity to https://api.together.xyz/v1.

Key Takeaways

  • Use the OpenAI-compatible openai SDK with base_url="https://api.together.xyz/v1" for Together AI models.
  • Supported models include meta-llama/Llama-3.3-70B-Instruct-Turbo; model IDs follow a provider/model naming pattern such as the meta-llama/ namespace.
  • Streaming and async calls are supported for efficient integration.
  • Always set your API key in the TOGETHER_API_KEY environment variable.
  • Verify model names and API endpoint if you encounter errors.
Verified 2026-04 · meta-llama/Llama-3.3-70B-Instruct-Turbo