Closed: kopyl closed this issue 8 months ago
Try a more basic setup: remove --mixed_precision and use tensorboard rather than wandb.
Can you try this again? We recently performed some upgrades to the scripts.
Can't do that right now, will do it later, thanks.
What exactly did you change in the script?
Indeed, removing --mixed_precision helped. But now training is twice as slow :(
Is there any way to still use --mixed_precision as shown in the official Diffusers guide?
Here is the error I'm getting: https://pastebin.com/1H1dMF5q
@maliozer
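For reference, mixed precision can be enabled either through accelerate itself at launch time or through the script's own flag; both exist for this script. A minimal sketch, using the model and dataset ids from the official guide rather than anything confirmed in this thread:

```bash
# Launcher-level mixed precision (accelerate's own flag):
accelerate launch --mixed_precision="fp16" train_text_to_image_lora.py \
  --pretrained_model_name_or_path="runwayml/stable-diffusion-v1-5" \
  --dataset_name="lambdalabs/pokemon-blip-captions" \
  --output_dir="sd-pokemon-model-lora"
```

The script also accepts its own --mixed_precision argument, which is the form used in the official guide.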
Removing --mixed_precision shouldn't be necessary. See this Colab: https://colab.research.google.com/gist/sayakpaul/ce4eb75b1e5751f4d389a086ca2880a2/scratchpad.ipynb
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
Please note that issues that do not follow the contributing guidelines are likely to be ignored.
I have to wake this issue up again. I tried the LoRA example with and without LoRA weights, with the same seed (42), and there is no change in the generated images. It seems the image comes from the pretrained Stable Diffusion model itself.
Attached images: 0_nolora_blue_pokemon.png and 1_with_lora_blue_pokemon.png (same image as above).
I shared a Colab: https://colab.research.google.com/drive/1l_gDFGgRF5PDad3lDLHoF3SrEFqJ4jDq?usp=sharing @sayakpaul
I don't have access to "/content/drive/MyDrive/research/temp/lora/pokemon/checkpoint-100" in your notebook.
```python
pipe.unet.load_attn_procs("checkpoint_path", adapter_name="pokemon")
pipe.unet.set_adapters(["pokemon"], weights=[1.0])
pipe.unet.fuse_lora()
```

has no effect, but

```python
pipe.unload_lora_weights()
pipe.load_lora_weights("checkpoint_path")
```

works properly. I can move forward with this method, but I'm not sure about further issues related to set_adapters and fuse_lora. For your info.
```python
pipe.unet.load_attn_procs("checkpoint_path", adapter_name="pokemon")
```

Instead of this, use load_lora_weights() on the pipe object?
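For illustration, a minimal sketch of that suggestion, assuming a diffusers version with the PEFT backend; the checkpoint path is a placeholder, not a path from this thread:

```python
import torch
from diffusers import DiffusionPipeline

# Load the base model in fp16 on a CUDA device.
pipe = DiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# Load the LoRA checkpoint on the pipeline itself, not on pipe.unet.
pipe.load_lora_weights("checkpoint_path", adapter_name="pokemon")

image = pipe(
    "A pokemon with blue eyes.", generator=torch.manual_seed(42)
).images[0]
```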
How do I fuse multiple adapters without set_adapters?
Why would you not want to use that?
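Continuing from the pipeline loaded above, a sketch of the multi-adapter flow using set_adapters on the pipe; the second adapter name and both paths are hypothetical, and the exact fuse behavior depends on the diffusers version:

```python
# Load two LoRA checkpoints under different adapter names (paths are placeholders).
pipe.load_lora_weights("checkpoint_path_a", adapter_name="pokemon")
pipe.load_lora_weights("checkpoint_path_b", adapter_name="style")

# Activate both adapters with per-adapter weights, then fuse them into the
# base weights so inference no longer pays the runtime LoRA overhead.
pipe.set_adapters(["pokemon", "style"], adapter_weights=[1.0, 0.5])
pipe.fuse_lora()
```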
Ok I got the process now :smile:
Describe the bug
I ran into this bug: https://github.com/huggingface/diffusers/issues/5897, so I used this script for training: https://github.com/huggingface/diffusers/blob/1477865e4838d887bb93750dc325e10f1e6ae534/examples/text_to_image/train_text_to_image_lora.py
Reproduction
Install:
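A typical setup for this example, following the requirements file that ships with the script (not necessarily the exact commands used here):

```bash
# Install diffusers from source plus the example's requirements.
git clone https://github.com/huggingface/diffusers
cd diffusers
pip install .
cd examples/text_to_image
pip install -r requirements.txt

# Configure accelerate (interactive).
accelerate config
```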
Train:
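A typical launch, following the official text-to-image LoRA guide; the model, dataset, and hyperparameters below are the guide's defaults, not necessarily the reporter's exact flags:

```bash
accelerate launch train_text_to_image_lora.py \
  --pretrained_model_name_or_path="runwayml/stable-diffusion-v1-5" \
  --dataset_name="lambdalabs/pokemon-blip-captions" \
  --resolution=512 --random_flip \
  --train_batch_size=1 \
  --max_train_steps=15000 \
  --learning_rate=1e-04 \
  --lr_scheduler="cosine" --lr_warmup_steps=0 \
  --mixed_precision="fp16" \
  --seed=42 \
  --output_dir="sd-pokemon-model-lora" \
  --validation_prompt="A pokemon with blue eyes." \
  --report_to="wandb"
```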
Logs
No response
System Info
RTX 4090, Linux
Who can help?
No response