How to use the Hugging Face Hub in Python
Direct answer
Use the huggingface_hub Python library to authenticate, download, and interact with models on the Hugging Face Hub programmatically.

Setup
Install
pip install huggingface_hub

Env vars
HF_API_TOKEN

Imports
from huggingface_hub import HfApi, hf_hub_download
import os

Examples
In: Download the 'bert-base-uncased' tokenizer files
Out: Files downloaded locally with their paths returned

In: List models under the 'facebook' organization
Out: List of model names like ['facebook/bart-large', 'facebook/mbart-large-50']

In: Download a specific file from a private repo using a token
Out: File downloaded successfully with access-token authentication
Integration steps
- Install the huggingface_hub package via pip
- Set your Hugging Face API token in the HF_API_TOKEN environment variable
- Import HfApi and hf_hub_download from huggingface_hub
- Initialize HfApi client with the token from environment
- Use HfApi methods to list or get model info
- Use hf_hub_download to download model files locally
Full code
from huggingface_hub import HfApi, hf_hub_download
import os

# Load your Hugging Face API token from the environment
api_token = os.environ.get('HF_API_TOKEN')

# Initialize the API client
api = HfApi(token=api_token)

# Example 1: List models by an organization
models = api.list_models(author='facebook', limit=5)
print('Facebook models:')
for model in models:
    print(f'- {model.modelId}')

# Example 2: Download a file from a model repo
model_id = 'bert-base-uncased'
filename = 'config.json'
file_path = hf_hub_download(repo_id=model_id, filename=filename, token=api_token)
print(f'Downloaded {filename} to {file_path}')

Output
Facebook models:
- facebook/bart-large
- facebook/mbart-large-50
- facebook/wav2vec2-base-960h
- facebook/detr-resnet-50
- facebook/bart-large-mnli
Downloaded config.json to /home/user/.cache/huggingface/hub/models--bert-base-uncased/config.json
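The cache path in the output above follows the Hub's predictable layout: the plural repo type, then the repo id with '/' replaced by '--'. A small sketch of that naming rule (the helper name cache_dir_name is mine):

```python
def cache_dir_name(repo_id: str, repo_type: str = "model") -> str:
    # Mirrors the hub cache folder naming: '/' in the repo id becomes '--',
    # prefixed with the plural repo type (models--, datasets--, spaces--)
    return f"{repo_type}s--" + repo_id.replace("/", "--")

print(cache_dir_name("bert-base-uncased"))    # models--bert-base-uncased
print(cache_dir_name("facebook/bart-large"))  # models--facebook--bart-large
```

Knowing this layout makes it easy to spot-check what is already cached before triggering a fresh download.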
API trace
Request
{"author": "facebook", "limit": 5, "token": "<your_token>"} for list_models; {"repo_id": "bert-base-uncased", "filename": "config.json", "token": "<your_token>"} for hf_hub_download Response
{"models": [{"modelId": "facebook/bart-large"}, ...]} for list_models; local file path string for hf_hub_download Extract
For list_models: iterate response.models and access modelId; for hf_hub_download: use returned file path stringVariants
Streaming download with progress bar
Use when downloading large files to support resuming and to show a progress bar.

from huggingface_hub import hf_hub_download
import os

api_token = os.environ.get('HF_API_TOKEN')
file_path = hf_hub_download(
    repo_id='bert-base-uncased',
    filename='pytorch_model.bin',
    token=api_token,
    resume_download=True  # deprecated in recent huggingface_hub releases; resuming is now the default
)
print(f'Downloaded model weights to {file_path}')

Anonymous access without token
Use for public models where authentication is not required.

from huggingface_hub import hf_hub_download

file_path = hf_hub_download(repo_id='bert-base-uncased', filename='config.json')
print(f'Downloaded {file_path}')

Using HfApi to upload a model file
Use when you want to programmatically upload files to your Hugging Face model repository.

from huggingface_hub import HfApi
import os

api_token = os.environ.get('HF_API_TOKEN')
api = HfApi(token=api_token)
api.upload_file(
    path_or_fileobj='local_model.bin',
    path_in_repo='model.bin',
    repo_id='your-username/your-model',
    repo_type='model'
)
print('File uploaded successfully')

Performance
Latency: ~200-500 ms for metadata requests; file downloads depend on file size and network speed
Cost: Free for public models; private repo access requires a valid token at no additional cost
Rate limits: Hugging Face enforces per-user/token rate limits; typically around 60 API requests per minute
- Cache downloaded files locally to avoid repeated downloads
- Use specific filenames to minimize data transfer
- Limit list_models queries with 'limit' parameter to reduce overhead
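If you do hit the per-minute rate limits mentioned above, a client-side retry with exponential backoff keeps scripts resilient. This is a generic sketch (the with_backoff name is mine, not a huggingface_hub API):

```python
import time

def with_backoff(call, retries: int = 3, base_delay: float = 1.0):
    # Retry a zero-argument callable with exponential backoff:
    # wait base_delay, then 2x, then 4x, ... before giving up
    for attempt in range(retries):
        try:
            return call()
        except Exception:
            if attempt == retries - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))
```

Usage would look like with_backoff(lambda: list(api.list_models(author='facebook', limit=5))), wrapping any Hub call that might be throttled.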
| Approach | Latency | Cost/call | Best for |
|---|---|---|---|
| HfApi list_models | ~200ms | Free | Listing models and metadata |
| hf_hub_download | Varies by file size | Free | Downloading model files |
| Anonymous access | ~200ms | Free | Public models without authentication |
Quick tip
Always store your Hugging Face API token in an environment variable and pass it to the client to access private models securely.
Common mistake
Forgetting to set or pass the HF_API_TOKEN environment variable causes authentication errors when accessing private repos.