How to install the PEFT library
Quick answer

Install the peft library with pip install peft to enable parameter-efficient fine-tuning methods such as LoRA and QLoRA. The library integrates with Hugging Face Transformers for efficient model adaptation.

Prerequisites

- Python 3.8+
- pip installed
- Hugging Face Transformers installed (pip install transformers)
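The prerequisites above can be checked programmatically; here is a minimal sketch using only the standard library:

```python
import sys
import importlib.util

# Python 3.8+ is required for PEFT.
assert sys.version_info >= (3, 8), "PEFT requires Python 3.8 or higher"

# importlib.util.find_spec returns None when a package is not installed.
for pkg in ("pip", "transformers"):
    status = "found" if importlib.util.find_spec(pkg) else "missing"
    print(f"{pkg}: {status}")
```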
Setup
Install the peft library via pip. It requires Python 3.8 or higher and works alongside Hugging Face Transformers. Make sure you have pip and transformers installed.
```shell
pip install peft
```

Step by step
After installation, you can import peft in Python and start configuring LoRA or QLoRA adapters for your models. Here's a minimal example showing how to import and check the version.
```python
import peft

print(f"PEFT version: {peft.__version__}")
```

Output:

```
PEFT version: 0.5.0
```
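If the import itself fails, you can also query the installed version through the standard library's importlib.metadata, which works without importing peft:

```python
from importlib.metadata import version, PackageNotFoundError

# Look up the installed distribution's version without importing the
# package, so a broken import does not mask a missing install.
try:
    print(f"PEFT version: {version('peft')}")
except PackageNotFoundError:
    print("peft is not installed; run `pip install peft`")
```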
Common variations
You can combine peft with transformers to load pretrained models and apply LoRA or QLoRA configurations. For example, use LoraConfig and get_peft_model to wrap your base model. Async usage is not typical for PEFT.
```python
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

# Load the base model (downloads weights; the Llama repo is gated and
# requires accepting its license on the Hugging Face Hub)
model_name = "meta-llama/Llama-3.1-8B-Instruct"
model = AutoModelForCausalLM.from_pretrained(model_name)

# Apply a LoRA adapter to the attention query and value projections
config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["q_proj", "v_proj"],
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, config)
print("LoRA model ready")
```

Output:

```
LoRA model ready
```
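To see why a LoRA setup like this is parameter-efficient, here is a back-of-envelope calculation with illustrative numbers (the hidden size and layer count below are hypothetical round figures, not the exact shapes of Llama-3.1-8B):

```python
# Back-of-envelope LoRA parameter count (illustrative numbers).
d = 4096          # hypothetical hidden size of q_proj / v_proj
r = 16            # LoRA rank, matching the LoraConfig above
n_layers = 32     # hypothetical number of transformer layers
n_targets = 2     # q_proj and v_proj

full_params_per_matrix = d * d          # dense weight: d x d
lora_params_per_matrix = r * d + d * r  # low-rank factors A (r x d) and B (d x r)

total_full = full_params_per_matrix * n_layers * n_targets
total_lora = lora_params_per_matrix * n_layers * n_targets

print(f"dense params in targeted matrices: {total_full:,}")
print(f"LoRA adapter params:               {total_lora:,}")
print(f"trainable fraction: {total_lora / total_full:.2%}")
```

Under these assumptions the adapters train well under 1% of the targeted weights, which is why LoRA fine-tuning fits on much smaller hardware than full fine-tuning.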
Troubleshooting
- If pip install peft fails, ensure your Python version is 3.8 or higher.
- Check that transformers is installed and up to date.
- For GPU support, verify your PyTorch installation matches your CUDA version.
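The PyTorch/CUDA point can be checked with a short diagnostic; a sketch (the CUDA lines only run when torch is installed):

```python
import importlib.util

# Guard the import so this also runs on machines without PyTorch.
if importlib.util.find_spec("torch") is None:
    print("PyTorch not installed")
else:
    import torch
    # torch.version.cuda is the CUDA version torch was built against
    # (None for CPU-only builds); compare it with your driver's version.
    print(f"torch {torch.__version__}, CUDA build: {torch.version.cuda}")
    print(f"CUDA available: {torch.cuda.is_available()}")
```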
Key Takeaways
- Use pip install peft to install the PEFT library quickly.
- PEFT integrates tightly with Hugging Face Transformers for LoRA and QLoRA fine-tuning.
- Ensure Python 3.8+ and Transformers are installed before using PEFT.
- Configure LoRA adapters with LoraConfig and get_peft_model.
- Troubleshoot by verifying Python version and dependencies if installation fails.