
How to download a model in Ollama

Quick answer
To download a model in Ollama, run the CLI command ollama pull <model-name>, which fetches the model weights to your machine so it can run locally, including offline. Alternatively, the official ollama Python package lets you pull and manage models programmatically.

PREREQUISITES

  • Ollama CLI installed
  • Python 3.8+ (for SDK usage)
  • pip install ollama (optional, for the Python SDK)

Setup

Install the Ollama CLI from https://ollama.com/download; no account is needed to pull models from the public library. For Python integration, install the ollama package via pip.

bash
pip install ollama
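
Before pulling anything, it can help to confirm the CLI is actually on your PATH. A minimal check using only the Python standard library:

```python
import shutil

def ollama_installed() -> bool:
    """Return True if the ollama binary is discoverable on PATH."""
    return shutil.which("ollama") is not None

if not ollama_installed():
    print("Ollama CLI not found; install it from https://ollama.com/download")
```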

Step by step

Use the ollama pull command to download a model locally. You can run it directly in a terminal (ollama pull llama2) or shell out to it from a script, as this Python example does for the llama2 model.

python
import subprocess

# Download model using Ollama CLI
model_name = "llama2"
subprocess.run(["ollama", "pull", model_name], check=True)

print(f"Model '{model_name}' downloaded successfully.")
output
Model 'llama2' downloaded successfully.
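
The same download can go through the ollama Python package instead of subprocess; its pull() function blocks until the model is fetched. A defensive sketch that returns False rather than crashing when the package or the local server is unavailable:

```python
def pull_with_sdk(model_name: str) -> bool:
    """Pull a model via the ollama package; return False if anything is missing."""
    try:
        import ollama  # pip install ollama
    except ImportError:
        print("ollama package not installed; run: pip install ollama")
        return False
    try:
        ollama.pull(model_name)  # blocks until the download completes
        return True
    except Exception as exc:  # e.g. server not running, or unknown model
        print(f"pull failed: {exc}")
        return False
```

Note that ollama.pull talks to the local Ollama server over HTTP rather than downloading the model itself, so this only succeeds when the server is running.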

Common variations

ollama list shows the models already downloaded to your machine; to see what is available to pull, browse the library at https://ollama.com/library. For programmatic use, the Ollama Python SDK can pull, list, and run models from code.

python
# List models already downloaded locally
subprocess.run(["ollama", "list"])

# Example: download a different model from the library
subprocess.run(["ollama", "pull", "mistral"])
output
NAME            ID    SIZE    MODIFIED
llama2:latest   ...   3.8 GB  ...
mistral:latest  ...   4.1 GB  ...
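
Model names can also carry a tag, as in ollama pull llama2:13b; without one, the CLI defaults to the latest tag. A small helper that normalizes names before pulling (the regex here is a deliberate simplification, not Ollama's real validation rules):

```python
import re
import subprocess

def normalize_model_name(name: str) -> str:
    """Accept 'model' or 'model:tag'; default the tag to 'latest'."""
    if not re.fullmatch(r"[\w.-]+(:[\w.-]+)?", name):
        raise ValueError(f"invalid model name: {name!r}")
    return name if ":" in name else f"{name}:latest"

def pull(name: str) -> None:
    """Pull the normalized model via the Ollama CLI."""
    subprocess.run(["ollama", "pull", normalize_model_name(name)], check=True)
```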

Troubleshooting

  • If ollama pull fails, check your internet connection and make sure the Ollama server is running (start the desktop app, or run ollama serve).
  • Ensure the model name and tag are spelled correctly; the library at https://ollama.com/library shows what is available.
  • For permission errors, run the CLI with appropriate user privileges.
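
In scripts, the failure modes above (CLI missing, pull rejected) can be caught rather than left to crash the program. A sketch; the binary parameter exists only so the fallback path is easy to exercise:

```python
import subprocess

def try_pull(model: str, binary: str = "ollama") -> bool:
    """Attempt a pull; print a hint and return False instead of raising."""
    try:
        subprocess.run([binary, "pull", model],
                       capture_output=True, text=True, check=True)
        return True
    except FileNotFoundError:
        print(f"{binary} not found; is the Ollama CLI installed and on PATH?")
        return False
    except subprocess.CalledProcessError as exc:
        print(f"pull failed: {exc.stderr.strip()}")
        return False
```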

Key Takeaways

  • Use ollama pull <model-name> to download models locally via CLI.
  • Install the ollama Python package for programmatic model management.
  • Run ollama list to see which models are already on your machine.
  • Make sure the Ollama server is running if a download fails.
  • Model names and availability may change; always verify with the latest CLI commands.