How to start the LiteLLM proxy server
Quick answer
To start the LiteLLM proxy server, install the litellm package with its proxy extras via pip, then run litellm --model <model> from your terminal. This launches a local, OpenAI-compatible proxy server that routes requests to the model backend you specify.

Prerequisites
- Python 3.8+
- pip install 'litellm[proxy]'
- A configured LLM backend or model (plus any API key that backend requires)
Setup
Install the litellm package with its proxy dependencies using pip. Ensure you have Python 3.8 or higher installed. No API key is needed to start the proxy itself, though the backend model you route to may require one.

pip install 'litellm[proxy]'

Step by step
Run the LiteLLM proxy server, passing the model you want to serve. This command starts a local HTTP proxy (on port 4000 by default) that forwards OpenAI-format requests to that model's backend.

litellm --model gpt-3.5-turbo

On startup the proxy logs the address it is listening on (http://0.0.0.0:4000 by default) and is then ready to accept requests.
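Once the proxy is up, any OpenAI-compatible client can talk to it. As a minimal sketch (assuming the default port 4000 and the OpenAI-style /v1/chat/completions route the proxy exposes; the helper names are illustrative, not part of litellm):

```python
import json
import urllib.request

# Assumption: proxy started with defaults, so it listens on port 4000.
PROXY_URL = "http://localhost:4000"

def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-format chat completion payload for the proxy."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def send_chat_request(payload: dict) -> dict:
    """POST the payload to the proxy's OpenAI-compatible endpoint."""
    req = urllib.request.Request(
        f"{PROXY_URL}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Requires the proxy to be running:
# reply = send_chat_request(build_chat_request("gpt-3.5-turbo", "Hello!"))
```

Because the proxy speaks the OpenAI wire format, you can also point the official openai client at it by setting its base URL to the proxy address.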
Common variations
You can choose a different port or point the proxy at a specific backend using command-line options. For example, litellm --model ollama/llama2 --port 8080 --api_base http://localhost:8000 starts the proxy on port 8080 and forwards requests to a backend serving at port 8000.
Use litellm --help to see all available options.
On startup the proxy again logs the address it is listening on, here http://0.0.0.0:8080.
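If you assemble these invocations programmatically (for scripts or process managers), a small helper keeps the flags straight. This is a hypothetical convenience function, not part of litellm; the flags it emits match the CLI options described above:

```python
import shlex
from typing import Optional

def litellm_command(model: str, port: int = 4000,
                    api_base: Optional[str] = None) -> str:
    """Assemble a litellm proxy invocation from common options.

    Hypothetical helper for illustration; --model, --port, and
    --api_base are real litellm CLI flags.
    """
    parts = ["litellm", "--model", model, "--port", str(port)]
    if api_base:
        parts += ["--api_base", api_base]
    # shlex.join quotes any argument that needs shell escaping.
    return shlex.join(parts)

print(litellm_command("ollama/llama2", port=8080,
                      api_base="http://localhost:8000"))
# litellm --model ollama/llama2 --port 8080 --api_base http://localhost:8000
```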
Troubleshooting
- If the proxy fails to start because the port is in use, choose a different port with --port.
- If requests fail, verify that the backend URL passed via --api_base is correct and that the backend server is running.
- Check firewall or network settings that might block local connections.
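To check the port-in-use case before (or after) launching, a quick probe with the standard library suffices. A minimal sketch, assuming the default port 4000; port_in_use is an illustrative helper, not a litellm function:

```python
import socket

def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if something is already listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(1.0)
        # connect_ex returns 0 when the TCP connection succeeds,
        # i.e. a server is accepting connections on that port.
        return s.connect_ex((host, port)) == 0

# Assumption: 4000 is the proxy's default port.
if port_in_use(4000):
    print("Port 4000 is taken; start the proxy with --port <other>")
else:
    print("Port 4000 looks free")
```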
Key Takeaways
- Install LiteLLM with pip install 'litellm[proxy]' to get the proxy command-line tool.
- Run litellm --model <model> to start the local proxy server, on port 4000 by default.
- Customize the port and backend with flags such as --port and --api_base for flexible setups.
- Use litellm --help to explore all configuration options.
- Check port availability and backend server status if the proxy fails to start or forward requests.