How to download models in LM Studio
Quick answer
To download models in LM Studio, use the built-in model search (the Discover tab) or the bundled lms CLI: lms get <model_name>. Either way, the model is stored locally for offline use inside LM Studio.
Prerequisites
- LM Studio installed
- The lms CLI on your PATH (it ships with LM Studio)
- Internet connection for the initial model download
Setup
Install LM Studio from lmstudio.ai and make sure you have an active internet connection for downloading models. LM Studio ships with the lms command-line tool; on macOS or Linux you can typically add it to your PATH by running the bundled bootstrap command:
~/.lmstudio/bin/lms bootstrap
Step by step
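As a quick sanity check after setup, you can confirm that the lms binary is actually reachable from your shell. A minimal sketch using only the Python standard library (the command name "lms" is the only assumption here):

```python
import shutil

def cli_available(name: str) -> bool:
    """Return True if an executable with this name is on the PATH."""
    return shutil.which(name) is not None

# After running the bootstrap step, this should report True for "lms";
# if it reports False, open a new terminal so PATH changes take effect.
print(cli_available("lms"))
```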
Use the lms get command to download a model locally. For example, to download a Llama 3 variant:
lms get llama-3
The CLI searches the model catalog, shows download progress, and confirms once the model is available locally.
Common variations
List the models you have already downloaded with lms ls, and note that many models are published in several quantization variants. To use a downloaded model in LM Studio, select it from the model dropdown in the app, load it from the command line with lms load, or reference it in API calls against the local server.
lms ls
lms get <model_name>
lms load <model_name>
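For the API route, LM Studio exposes an OpenAI-compatible server, by default at http://localhost:1234/v1 once you start it (from the Developer tab or with lms server start). A minimal sketch of assembling and sending a chat request; the model name here is purely illustrative, and send_chat only works with the server running:

```python
import json
import urllib.request

BASE_URL = "http://localhost:1234/v1"  # LM Studio's default local server address

def build_chat_request(model: str, prompt: str, temperature: float = 0.7) -> dict:
    """Assemble an OpenAI-style chat-completion payload for the local server."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }

def send_chat(payload: dict) -> dict:
    """POST the payload to the local server (requires the server to be running)."""
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Model name is an assumption; use whatever `lms ls` reports on your machine.
payload = build_chat_request("llama-3.2-1b-instruct", "Hello")
```

Because the endpoint follows the OpenAI schema, any OpenAI client library pointed at BASE_URL works the same way.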
Troubleshooting
If a download fails, check your internet connection and make sure you have enough free disk space; model files are typically several gigabytes. If a model appears to be missing, verify the exact name and confirm what is actually installed with lms ls. For server-side issues, the in-app Developer logs or lms log stream can help diagnose the problem.
lms ls
lms log stream
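One of the checks above, free disk space, is easy to automate before kicking off a large download. A minimal sketch using only the Python standard library (the 4 GB figure is a rough rule of thumb for a 4-bit quantized 7B model, not an exact requirement):

```python
import shutil

def enough_disk_space(path: str, needed_gb: float) -> bool:
    """Check that the filesystem containing `path` has at least `needed_gb` GB free."""
    free_bytes = shutil.disk_usage(path).free
    return free_bytes >= needed_gb * 1024**3

# Assumption: ~4 GB needed for a 4-bit 7B model; adjust for your target model.
print(enough_disk_space(".", 4))
```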
Key Takeaways
- Use lms get <model_name> (or the in-app Discover tab) to download models for local use in LM Studio.
- List downloaded models with lms ls before loading or serving them.
- If downloads fail, check connectivity and disk space, and inspect logs with lms log stream.