mkshing / ziplora-pytorch

Implementation of "ZipLoRA: Any Subject in Any Style by Effectively Merging LoRAs"
MIT License

AttributeError: 'Linear' object has no attribute 'set_lora_layer' #26

Open Jamie-Cheung opened 3 months ago

Jamie-Cheung commented 3 months ago

Whether I uninstall peft or not, I still hit this error. Can someone tell me the solution?

lukasz-staniszewski commented 2 months ago

This repository hasn't been updated for newer diffusers versions. In version 0.25.0, which introduced the peft integration, the train_dreambooth_lora_sdxl.py file was rewritten entirely, and set_lora_layer is no longer used there.
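To make the failure mode concrete, here is a minimal, self-contained sketch. The two classes are stand-ins for diffusers' pre-0.25 `LoRACompatibleLinear` wrapper and a plain `torch.nn.Linear` (the layer type you get on newer diffusers, hence the AttributeError); `attach_lora` is a hypothetical helper, not part of either library:

```python
class LoRACompatibleLinear:
    """Stand-in for the wrapper layer diffusers <= 0.24 used in the UNet."""

    def set_lora_layer(self, lora_layer):
        self.lora_layer = lora_layer


class Linear:
    """Stand-in for a plain torch.nn.Linear (diffusers >= 0.25 / peft path),
    which has no set_lora_layer method."""


def attach_lora(module, lora_layer):
    # Turn the bare AttributeError into an actionable message.
    if not hasattr(module, "set_lora_layer"):
        raise RuntimeError(
            "module has no set_lora_layer; you are likely running "
            "diffusers >= 0.25, which dropped the LoRACompatible* wrappers. "
            "Downgrade to diffusers==0.24.0 to run this repo's scripts."
        )
    module.set_lora_layer(lora_layer)
```

With the old wrapper, `attach_lora(LoRACompatibleLinear(), layer)` succeeds; with a plain `Linear`, it raises the explanatory RuntimeError instead of the opaque AttributeError from the traceback above.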

I managed to run the older scripts in this repo, for both LoRA and ZipLoRA fine-tuning, by downgrading most of the packages. Here is the requirements.txt that lets me run LoRA training with this repo; I hope it helps:

accelerate==0.29.3
bitsandbytes==0.43.1
diffusers==0.24.0
ftfy==6.2.0
huggingface-hub==0.22.2
Jinja2==3.1.3
numpy==1.26.4
pillow==10.3.0
pytorch-cuda==12.1
safetensors==0.4.3
tensorboard==2.16.2
torch==2.2.2
torchvision==0.17.2
tqdm==4.66.2
transformers==4.40.1
triton==2.2.0
wandb==0.16.6
xformers==0.0.25.post1
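If you want to confirm your environment actually matches these pins before launching training, a small stdlib-only checker like the hypothetical `check_pins` below can compare them against what pip sees (note that `pytorch-cuda==12.1` looks like a conda package, so pip metadata may not report it even in a working environment):

```python
# Hypothetical helper: compare a pinned requirements list against the
# versions pip has installed. Uses only the standard library (Python >= 3.8).
from importlib.metadata import version, PackageNotFoundError


def check_pins(requirements_text):
    """Return a list of (package, pinned, installed) mismatches.

    `installed` is None when the package is not importable via pip metadata.
    Lines without an exact `==` pin (comments, blanks) are skipped.
    """
    mismatches = []
    for line in requirements_text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "==" not in line:
            continue
        name, pinned = line.split("==", 1)
        try:
            installed = version(name)
        except PackageNotFoundError:
            installed = None
        if installed != pinned:
            mismatches.append((name, pinned, installed))
    return mismatches


if __name__ == "__main__":
    with open("requirements.txt") as f:
        for name, pinned, installed in check_pins(f.read()):
            print(f"{name}: pinned {pinned}, installed {installed or 'missing'}")
```

Run it from the repo root after `pip install -r requirements.txt`; an empty output means every pinned package matches.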