haofanwang / Lora-for-Diffusers

The most easy-to-understand tutorial for using LoRA (Low-Rank Adaptation) within diffusers framework for AI Generation Researchers🔥
MIT License

size mismatch using the converted .bin file #22

Open XiaoyuShi97 opened 10 months ago

XiaoyuShi97 commented 10 months ago

Hi, thanks a lot for your great work. I am converting a LoRA file in safetensors format, downloaded from civitai, using your format_convert.py. I then load the converted .bin file with pipe.unet.load_attn_procs, but I get the following error:

RuntimeError: Error(s) in loading state_dict for LoRACrossAttnProcessor:

size mismatch for to_q_lora.down.weight: copying a param with shape torch.Size([128, 320]) from checkpoint, the shape in current model is torch.Size([4, 320]).

It seems to be related to the config of the unet's attention processors, but I could not find the corresponding documentation. Could you please provide some suggestions?
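For context, the mismatch above looks like a LoRA rank disagreement: the checkpoint's `down` weight is `(128, 320)` (rank 128), while the processor built in the current model expects `(4, 320)` (rank 4, a common default). A minimal plain-PyTorch sketch that reproduces the same `load_state_dict` error, assuming the LoRA down projection is an `nn.Linear` over a 320-dim hidden state (the module names here are illustrative, not diffusers internals):

```python
import torch.nn as nn

hidden = 320  # in_features of the to_q projection in the error message

# "Checkpoint" LoRA down weight: rank 128 -> weight shape (128, 320)
ckpt_down = nn.Linear(hidden, 128, bias=False)
# Freshly built processor's down weight: rank 4 -> weight shape (4, 320)
model_down = nn.Linear(hidden, 4, bias=False)

try:
    # Copying rank-128 weights into a rank-4 module fails, just like
    # load_attn_procs fails when the ranks disagree.
    model_down.load_state_dict(ckpt_down.state_dict())
except RuntimeError as err:
    print("size mismatch" in str(err))  # True
```

So the converted weights themselves may be fine; the processor just needs to be constructed with the rank the checkpoint was trained at (128 here) rather than the default.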