How to add authentication to a FastAPI LLM endpoint
Quick answer
Add authentication to a FastAPI LLM endpoint by implementing an API key check in a dependency or middleware that validates incoming requests before the LLM is called. Use Depends with a custom function that verifies the API key from the request headers and raises HTTPException if the request is unauthorized.

Prerequisites
- Python 3.8+
- FastAPI
- OpenAI API key (free tier works)
- pip install fastapi uvicorn "openai>=1.0"
Setup
Install fastapi and uvicorn for the web server, and the openai SDK for LLM calls. Set your OpenAI API key and your own API key for authentication as environment variables.
pip install fastapi uvicorn openai

- Set environment variables:

export OPENAI_API_KEY='your_openai_key'
export API_KEY='your_custom_api_key'

Step by step
Create a FastAPI app with an API key dependency that checks the X-API-Key header. If the key matches the expected value, proceed to call the OpenAI LLM endpoint using the openai SDK. Return the LLM response as JSON.
import os

from fastapi import FastAPI, Header, HTTPException, Depends
from openai import OpenAI

app = FastAPI()

# Load keys from environment
OPENAI_API_KEY = os.environ["OPENAI_API_KEY"]
API_KEY = os.environ["API_KEY"]

client = OpenAI(api_key=OPENAI_API_KEY)

def verify_api_key(x_api_key: str = Header(...)):
    # FastAPI maps the x_api_key parameter to the X-API-Key header
    if x_api_key != API_KEY:
        raise HTTPException(status_code=401, detail="Unauthorized")

@app.post("/llm")
async def llm_endpoint(prompt: str, _: None = Depends(verify_api_key)):
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    return {"response": response.choices[0].message.content}
# To run: uvicorn filename:app --reload

Common variations
- Use async calls with httpx or the openai async client if supported.
- Authenticate with OAuth2 or JWT tokens instead of API keys for more complex security.
- Switch to other LLM providers by changing the client initialization and model name.
Troubleshooting
- If you get 401 Unauthorized, verify the X-API-Key header is sent correctly.
- Check that environment variables are loaded properly.
- Ensure the OpenAI API key is valid and has access to the model.
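A quick way to rule out the environment-variable issues above is a small helper (check_env is a hypothetical name, not part of FastAPI or the openai SDK):

```python
import os

def check_env(names=("OPENAI_API_KEY", "API_KEY")):
    """Report which of the expected environment variables are set."""
    return {name: bool(os.environ.get(name)) for name in names}

print(check_env())
```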
Key Takeaways
- Use FastAPI dependencies to enforce API key authentication on LLM endpoints.
- Always load API keys from environment variables to avoid hardcoding secrets.
- Return clear HTTP 401 errors for unauthorized requests to secure your API.
- You can extend authentication to OAuth2 or JWT for advanced use cases.
- Use the official OpenAI SDK v1+ pattern for calling LLMs securely.