luosiallen / latent-consistency-model

Latent Consistency Models: Synthesizing High-Resolution Images with Few-Step Inference
MIT License

unet_lora checkpoints from the training scripts cannot be used directly #78

Open xingyueye opened 9 months ago

xingyueye commented 9 months ago

I have re-trained LCM-LoRA with the script [https://github.com/luosiallen/latent-consistency-model/blob/main/LCM_Training_Script/consistency_distillation/train_lcm_distill_lora_sd_wds.py]. However, the saved checkpoint cannot be used directly because the state-dict keys do not match. Is there a follow-up conversion step that puts the checkpoint into the same format as your released version?
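
Not an official fix, but one way to narrow down the mismatch is to diff the key names of the locally trained LoRA against the released LCM-LoRA weights and then remap them. Below is a minimal sketch, assuming both checkpoints are safetensors files whose only difference is the key naming; the file paths and the substitutions inside `rename_key()` are placeholders to be filled in from the printed output, not the actual mapping used by the released checkpoint.

```python
# Minimal sketch: compare the key layouts of the locally trained LoRA checkpoint
# and the released LCM-LoRA weights, then apply a (hypothetical) renaming rule.
# File paths and the substitutions in rename_key() are assumptions -- replace
# them with whatever your training run and the released checkpoint actually use.
from safetensors.torch import load_file, save_file

trained = load_file("output/pytorch_lora_weights.safetensors")                 # saved by the training script
released = load_file("lcm-lora-sdv1-5/pytorch_lora_weights.safetensors")       # released LCM-LoRA weights

# Inspect a few keys from each side to see how the naming differs.
print("trained :", sorted(trained.keys())[:3])
print("released:", sorted(released.keys())[:3])

def rename_key(key: str) -> str:
    """Hypothetical mapping from the training-script key style to the released style.
    Replace these substitutions with the actual differences printed above."""
    return (key
            .replace(".lora_layer.down.weight", ".lora.down.weight")
            .replace(".lora_layer.up.weight", ".lora.up.weight"))

converted = {rename_key(k): v for k, v in trained.items()}

# Report any keys that still do not line up with the released layout.
missing = set(released.keys()) - set(converted.keys())
if missing:
    print(f"{len(missing)} keys still unmatched, e.g. {sorted(missing)[:3]}")

save_file(converted, "pytorch_lora_weights_converted.safetensors")
```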

lileilai commented 6 months ago

@xingyueye I have faced the same problem. Have you found a solution?

Pakase commented 1 week ago

Same issue. It's weird. Any solution?