Open whmc76 opened 3 weeks ago
in my case, is it working?
```
INFO     train all blocks only                          lora_flux.py:502
INFO     create LoRA for Text Encoder 1:                lora_flux.py:589
INFO     create LoRA for Text Encoder 1: 72 modules.    lora_flux.py:592
INFO     create LoRA for FLUX all blocks: 114 modules.  lora_flux.py:606
INFO     enable LoRA for U-Net: 114 modules             lora_flux.py:75
```
If you are referring to what civit ai LoRAs seem to have in rapid mode, you'd select the linear1 layers of all single blocks.
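If you want to double-check what those rapid-mode LoRAs actually contain, you can list the keys of a downloaded .safetensors file and count the tensors per module. A minimal sketch (the file path is a placeholder, and the key naming shown in the comment assumes kohya-style keys):

```python
# Sketch: list which FLUX blocks/layers a downloaded LoRA actually touches.
# The file path is a placeholder; point it at a LoRA downloaded from Civitai.
from collections import Counter
from safetensors import safe_open

lora_path = "rapid_mode_lora.safetensors"  # placeholder path

layer_counts = Counter()
with safe_open(lora_path, framework="pt") as f:
    for key in f.keys():
        # Keys usually look like "lora_unet_single_blocks_12_linear1.lora_down.weight";
        # strip the trailing ".lora_down/.lora_up/.alpha" part to get the module name.
        module_name = key.split(".")[0]
        layer_counts[module_name] += 1

for name, n in sorted(layer_counts.items()):
    print(f"{name}: {n} tensors")
```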
Haha, that's exactly what I was going to do, so it looks like I need to list all the blocks one by one. I have another question: this looks like a series of strings joined with ',' rather than multiple lines of text. Could you confirm or correct that? Thank you very much.
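If it does turn out to be a single comma-separated string, you can generate it instead of typing it by hand. A small sketch, assuming FLUX.1's 38 single blocks and BFL-style module names; the exact training argument name and whether it accepts plain names or regex patterns depends on the sd-scripts version, so treat this as an illustration:

```python
# Sketch: build a comma-separated list of the linear1 layers of all single blocks.
# FLUX.1 has 38 single blocks; adjust the naming if your script expects a
# different convention (e.g. kohya-style keys with underscores).
num_single_blocks = 38
patterns = [f"single_blocks.{i}.linear1" for i in range(num_single_blocks)]
arg_value = ",".join(patterns)
print(arg_value)
# -> single_blocks.0.linear1,single_blocks.1.linear1,...,single_blocks.37.linear1
```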
I made two nodes for the L1 and L2 blocks that you can enable or disable. You can see them here: https://civitai.com/models/743615. The GitHub repo is https://github.com/plugcrypt/CRT-Nodes. Hope that helps.
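Conceptually, per-block enable/disable switches boil down to filtering the LoRA state dict before it is applied. A rough sketch of that idea, not the actual CRT-Nodes code, with kohya-style key naming assumed:

```python
# Sketch of the general idea behind per-block enable/disable switches:
# drop every tensor that belongs to a disabled block from the LoRA state dict
# before it is merged/applied.
import re

def filter_lora_blocks(state_dict: dict, disabled_blocks: set) -> dict:
    """Return a copy of the LoRA state dict without the disabled blocks.

    disabled_blocks holds identifiers like "single_blocks_5" or "double_blocks_3"
    (kohya-style naming is assumed; adjust the regex for other formats).
    """
    kept = {}
    for key, tensor in state_dict.items():
        m = re.search(r"(single_blocks|double_blocks)_(\d+)", key)
        block_id = f"{m.group(1)}_{m.group(2)}" if m else None
        if block_id in disabled_blocks:
            continue  # skip tensors belonging to disabled blocks
        kept[key] = tensor
    return kept
```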
very cool!
It seems that training only the attn blocks is a faster way to train a LoRA, but it doesn't seem to work when I write "attn" here.
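One way to debug why "attn" appears to do nothing is to check which module names the pattern can actually match: in FLUX, the double blocks have explicit img_attn/txt_attn modules, while the single blocks fuse attention into linear1/linear2, so nothing there is literally named "attn". A rough sketch over example module names, assuming BFL-style naming (adjust if your script uses a different convention):

```python
# Sketch: check what a substring/regex pattern like "attn" would actually match
# against FLUX block module names.
import re

example_modules = [
    "double_blocks.0.img_attn.qkv",
    "double_blocks.0.img_attn.proj",
    "double_blocks.0.txt_attn.qkv",
    "double_blocks.0.txt_attn.proj",
    "single_blocks.0.linear1",   # qkv + mlp-in are fused here
    "single_blocks.0.linear2",   # attn-out + mlp-out are fused here
]

pattern = "attn"
matches = [name for name in example_modules if re.search(pattern, name)]
print(matches)
# Only the double-block img_attn/txt_attn modules match; the single blocks have
# no module literally named "attn", which may be why the filter seems to do
# nothing for them.
```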