kijai / ComfyUI-FluxTrainer

how to train multiple or all attn blocks #67

Open whmc76 opened 3 weeks ago

whmc76 commented 3 weeks ago

It seems that training only all the attn blocks is a faster way to train a LoRA, but it doesn't seem to work when I write "attn" here: [screenshot]

whmc76 commented 3 weeks ago

[screenshot]

In my case, is it working?

    INFO     train all blocks only                            lora_flux.py:502
    INFO     create LoRA for Text Encoder 1:                  lora_flux.py:589
    INFO     create LoRA for Text Encoder 1: 72 modules.      lora_flux.py:592
    INFO     create LoRA for FLUX all blocks: 114 modules.    lora_flux.py:606
    INFO     enable LoRA for U-Net: 114 modules               lora_flux.py:75
kijai commented 3 weeks ago

If you are referring to what Civitai LoRAs seem to have in rapid mode, you'd select the linear1 layers of all single blocks.
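
To see why a plain "attn" filter would behave this way, here is a minimal, hypothetical sketch. The module names follow the reference FLUX implementation (19 double-stream blocks with img_attn/txt_attn submodules, and 38 single-stream blocks whose attention QKV is fused into linear1); the exact names your trainer matches against may differ.

```python
# Hypothetical FLUX module names, per the reference implementation:
# double-stream blocks expose img_attn/txt_attn submodules, while
# single-stream blocks fuse attention QKV and the MLP input into linear1.
double = [f"double_blocks.{i}.{m}" for i in range(19)
          for m in ("img_attn.qkv", "img_attn.proj",
                    "txt_attn.qkv", "txt_attn.proj")]
single = [f"single_blocks.{i}.{m}" for i in range(38)
          for m in ("linear1", "linear2")]

# A substring filter like "attn" only hits the double-stream blocks;
# no single-block module name contains "attn" at all.
attn_matches = [name for name in double + single if "attn" in name]
print(len(attn_matches))  # 76 double-block attention modules, 0 single-block
```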

whmc76 commented 3 weeks ago

> If you are referring to what Civitai LoRAs seem to have in rapid mode, you'd select the linear1 layers of all single blocks.

Haha, that's exactly what I was going to do, so it looks like I need to list all the blocks one by one. I have another question: it looks like this should be a single series of strings joined with ',' rather than multiple lines of text. I hope you can confirm or correct that. Thank you very much.
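
If it is indeed one comma-separated string, a small snippet could generate the list kijai describes instead of typing all 38 entries by hand. This is only a sketch: the `single_blocks.{i}.linear1` naming is an assumption based on the reference FLUX implementation, not a confirmed input format for the node.

```python
# Hypothetical: build a comma-separated list of the linear1 layer of each
# of FLUX.1's 38 single-stream blocks, as kijai suggests targeting.
targets = ",".join(f"single_blocks.{i}.linear1" for i in range(38))
print(targets)  # single_blocks.0.linear1,single_blocks.1.linear1,...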

plugcrypt commented 6 days ago

I made two nodes for L1 and L2 blocks that you can enable or disable; you can see them here: https://civitai.com/models/743615. The GitHub repo is https://github.com/plugcrypt/CRT-Nodes. Hope that helps.

whmc76 commented 5 days ago

> I made two nodes for L1 and L2 blocks that you can enable or disable; you can see them here: https://civitai.com/models/743615. The GitHub repo is https://github.com/plugcrypt/CRT-Nodes. Hope that helps.

Very cool!