Beginner · 3 min read

How to list available models in Ollama

Quick answer
To list the models installed locally in Ollama, run the ollama list CLI command, or call the local Ollama API endpoint that returns model info as JSON. The CLI is the simplest way to see all installed models.

Prerequisites

  • Python 3.8+
  • Ollama installed and configured locally
  • Access to terminal/command line

Setup

Install Ollama on your machine by following the official instructions at https://ollama.com/docs/install. Ensure you have Python 3.8+ installed for scripting; no additional Python packages are required to list models via the CLI.

bash
ollama --version
output
ollama version is 0.0.1

Step by step

Use the Ollama CLI to list all models installed locally. The ollama list command prints each installed model along with its ID, size, and last-modified time.

python
import subprocess

# Run the 'ollama list' command to list installed models
result = subprocess.run(['ollama', 'list'], capture_output=True, text=True)
print(result.stdout)
output
llama2
mistral
wizardlm
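For use in scripts, the table that ollama list prints can be reduced to plain model names. A minimal sketch, assuming the first line of the output is a column header and the model name is the first whitespace-separated field on each row (the sample text below is illustrative, not real output):

```python
def parse_ollama_list(output: str) -> list[str]:
    """Extract model names from `ollama list` output.

    Assumes the first line is a column header (NAME, ID, SIZE, ...)
    and the name is the first whitespace-separated field of each row.
    """
    lines = output.strip().splitlines()
    return [line.split()[0] for line in lines[1:] if line.strip()]

# Sample text standing in for real output captured via subprocess:
sample = """NAME            ID      SIZE    MODIFIED
llama2:latest   abc123  3.8 GB  2 weeks ago
mistral:latest  def456  4.1 GB  3 days ago
"""
print(parse_ollama_list(sample))  # ['llama2:latest', 'mistral:latest']
```

In practice you would pass result.stdout from the subprocess call above instead of the sample string.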

Common variations

You can also list models programmatically by calling Ollama's local API (served at http://localhost:11434 by default), or by parsing the CLI output for integration into Python scripts. For asynchronous usage, run the CLI command with asyncio.create_subprocess_exec.

python
import asyncio

async def list_models_async():
    proc = await asyncio.create_subprocess_exec(
        'ollama', 'list',
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.PIPE
    )
    stdout, stderr = await proc.communicate()
    if proc.returncode != 0:
        print(f'Error: {stderr.decode()}')
    else:
        print(stdout.decode())

asyncio.run(list_models_async())
output
llama2
mistral
wizardlm
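When the Ollama server is running, the same list is available over HTTP: a GET request to /api/tags returns a JSON object with a models array. A minimal sketch using only the standard library; the default port 11434 and the response shape are assumptions to verify against your Ollama version:

```python
import json
from urllib.request import urlopen

def names_from_tags(payload: dict) -> list[str]:
    """Pull model names out of an /api/tags response body."""
    return [m["name"] for m in payload.get("models", [])]

def list_models_api(base_url: str = "http://localhost:11434") -> list[str]:
    """Query a running Ollama server for its installed models.

    Requires the server to be up (e.g. started with `ollama serve`).
    """
    with urlopen(f"{base_url}/api/tags") as resp:
        return names_from_tags(json.load(resp))

# With a live server: print(list_models_api())
# Offline, the parsing can be exercised on a sample response body:
sample = {"models": [{"name": "llama2:latest"}, {"name": "mistral:latest"}]}
print(names_from_tags(sample))  # ['llama2:latest', 'mistral:latest']
```

Unlike the subprocess approach, the API returns structured JSON, so no output parsing is needed.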

Troubleshooting

  • If ollama list returns an error, ensure Ollama is installed and your PATH includes the Ollama binary.
  • If no models appear, install models using ollama pull <model-name>.
  • For permission errors, run the terminal as administrator or check your user permissions.
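The first two checks above can be automated from Python: shutil.which confirms the binary is on PATH before any subprocess call is attempted. A small diagnostic sketch (the exact messages are illustrative):

```python
import shutil
import subprocess

def check_ollama() -> str:
    """Return a short diagnostic covering the troubleshooting steps above."""
    path = shutil.which("ollama")
    if path is None:
        return "ollama not found on PATH -- check your installation"
    result = subprocess.run(["ollama", "list"], capture_output=True, text=True)
    if result.returncode != 0:
        return f"ollama found at {path} but failed: {result.stderr.strip()}"
    if len(result.stdout.strip().splitlines()) <= 1:
        return "no models installed -- run `ollama pull <model-name>`"
    return "ok"

print(check_ollama())
```

Running this before your main script gives a clearer error message than a raw FileNotFoundError from subprocess.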

Key Takeaways

  • Use the ollama list CLI command to quickly list all installed Ollama models.
  • You can integrate model listing in Python by running the CLI via subprocess or asyncio.
  • Ensure Ollama is properly installed and your environment PATH is configured to avoid command errors.
Verified 2026-04 · llama2, mistral, wizardlm