Closed miniwa closed 1 year ago
May I know what was the issue and how you fixed it?
It turned out to be a hyperparameter issue, not a model-merge issue. My bad. I used the example script here: https://github.com/cloneofsimo/lora/blob/master/training_scripts/use_face_conditioning_example.sh
In particular, having scale = 8 will fry the model very quickly.
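For intuition, here is a minimal sketch (not the repo's actual code) of the standard LoRA formulation, where the effective weight is W = W0 + scale * (B @ A). The scale multiplies the low-rank update directly, so scale = 8 drives the weights eight times further from the base model than scale = 1, which is why a large value degrades the model so quickly:

```python
import numpy as np

# Hypothetical illustration: W0 is a base weight matrix, (B, A) a
# rank-4 LoRA update. The scale factor multiplies the update, so the
# drift from the base weights grows linearly with scale.
rng = np.random.default_rng(0)
W0 = rng.normal(size=(16, 16))
A = rng.normal(size=(4, 16)) * 0.01  # low-rank factor (rank 4)
B = rng.normal(size=(16, 4)) * 0.01  # low-rank factor (rank 4)

delta = B @ A
for scale in (1.0, 8.0):
    W = W0 + scale * delta
    drift = np.linalg.norm(W - W0)
    print(f"scale={scale}: weight drift = {drift:.6f}")
```

The drift at scale 8 is exactly 8x the drift at scale 1; in a real network that extra magnitude compounds across layers and training steps.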
Steps to reproduce:

```sh
lora_add runwayml/stable-diffusion-v1-5 ./lora.safetensors ./output/sd-merged-lora.ckpt 1 --mode upl-ckpt-v2
```
Any ideas?