city96 / ComfyUI_ExtraModels

Support for miscellaneous image models. Currently supports: DiT, PixArt, HunYuanDiT, MiaoBi, and a few VAEs.
Apache License 2.0
394 stars · 35 forks

Issue loading a PEFT lora model #70

Open frutiemax92 opened 5 months ago

frutiemax92 commented 5 months ago

When trying to load this PEFT Lora model I have multiple issues. https://www.dropbox.com/scl/fi/30y9yn26ao8pnwch7z1ex/test_lora.zip?rlkey=r6kvgzwvrqm9tnw4jz8ctgu2f&st=x8r69pb3&dl=0

The first is that the code crashes at the line `k = new_modelpatcher.add_patches(loaded, strength)` in `lora.py` when using the automatic CFG node after loading the LoRA.

Also, when I try to load the LoRA, it doesn't find all the layers it needs to patch, i.e. only 14 layers out of 574 of them... An image is generated, but the LoRA is not applied correctly!

PixArt: LoRA conversion has leftover keys! (14 vs 574)
['transformer_blocks.0.attn1.to_k.lora_A.weight', 'transformer_blocks.8.attn2.to_q.lora_B.weight', 'transformer_blocks.1.attn1.to_q.lora_B.weight', 'transformer_blocks.23.attn1.to_out.0.lora_A.weight', 'transformer_blocks.15.attn1.to_k.lora_A.weight', 'transformer_blocks.19.attn2.to_k.lora_B.weight', 'transformer_blocks.4.attn1.to_v.lora_A.weight', 'transformer_blocks.22.attn2.to_out.0.lora_A.weight', 'transformer_blocks.7.ff.net.0.proj.lora_B.weight', 'transformer_blocks.9.attn1.to_out.0.lora_A.weight', 'transformer_blocks.7.attn1.to_k.lora_B.weight', 'transformer_blocks.16.attn1.to_out.0.lora_A.weight', 'transformer_blocks.2.attn2.to_out.0.lora_A.weight', 'transformer_blocks.3.attn2.to_k.lora_A.weight', 'transformer_blocks.9.attn2.to_q.lora_B.weight', 'transformer_blocks.1.attn2.to_k.lora_B.weight', 'transformer_blocks.26.attn2.to_q.lora_B.weight', 'transformer_blocks.10.attn1.to_v.lora_A.weight', 'transformer_blocks.2.attn1.to_out.0.lora_B.weight', 'transformer_blocks.5.attn1.to_q.lora_A.weight', 'transformer_blocks.25.attn1.to_k.lora_B.weight', 'transformer_blocks.9.attn1.to_k.lora_B.weight', 'transformer_blocks.12.attn1.to_k.lora_B.weight', 'transformer_blocks.25.ff.net.0.proj.lora_B.weight', 'transformer_blocks.27.attn1.to_v.lora_A.weight', 'transformer_blocks.1.attn1.to_out.0.lora_A.weight', 'transformer_blocks.8.ff.net.2.lora_A.weight', 'transformer_blocks.18.attn1.to_out.0.lora_B.weight', 'transformer_blocks.12.attn2.to_v.lora_A.weight', 'transformer_blocks.0.attn1.to_out.0.lora_B.weight', 'transformer_blocks.11.attn2.to_q.lora_A.weight', 'transformer_blocks.9.ff.net.2.lora_B.weight', 'transformer_blocks.7.attn1.to_q.lora_A.weight', 'transformer_blocks.11.attn1.to_v.lora_A.weight', 'transformer_blocks.20.attn2.to_out.0.lora_A.weight', 'transformer_blocks.26.attn2.to_q.lora_A.weight', 'transformer_blocks.13.attn2.to_q.lora_B.weight', 'transformer_blocks.6.attn2.to_k.lora_B.weight', 'transformer_blocks.15.attn1.to_v.lora_A.weight', 
'transformer_blocks.16.attn2.to_out.0.lora_B.weight', 'transformer_blocks.27.attn2.to_k.lora_B.weight', 'transformer_blocks.7.attn2.to_out.0.lora_A.weight', 'transformer_blocks.7.attn2.to_out.0.lora_B.weight', 'transformer_blocks.20.attn1.to_v.lora_B.weight', 'transformer_blocks.25.attn2.to_q.lora_B.weight', 'transformer_blocks.17.ff.net.0.proj.lora_B.weight', 'transformer_blocks.15.ff.net.2.lora_B.weight', 'transformer_blocks.8.attn2.to_v.lora_B.weight', 'transformer_blocks.19.ff.net.2.lora_B.weight', 'transformer_blocks.11.attn2.to_k.lora_B.weight', 'transformer_blocks.13.attn2.to_out.0.lora_B.weight', 'transformer_blocks.5.ff.net.2.lora_A.weight', 'transformer_blocks.6.attn1.to_out.0.lora_B.weight', 'transformer_blocks.12.attn1.to_out.0.lora_A.weight', 'transformer_blocks.19.attn2.to_q.lora_A.weight', 'transformer_blocks.2.ff.net.2.lora_A.weight', 'transformer_blocks.0.attn2.to_out.0.lora_B.weight', 'transformer_blocks.10.ff.net.0.proj.lora_A.weight', 'transformer_blocks.15.attn1.to_q.lora_B.weight', 'transformer_blocks.17.ff.net.0.proj.lora_A.weight', 'transformer_blocks.22.ff.net.2.lora_B.weight', 'transformer_blocks.6.attn1.to_q.lora_A.weight', 'transformer_blocks.3.attn1.to_q.lora_B.weight', 'transformer_blocks.17.attn2.to_q.lora_A.weight', 'transformer_blocks.25.attn1.to_k.lora_A.weight', 'transformer_blocks.4.attn1.to_out.0.lora_A.weight', 'transformer_blocks.4.attn2.to_q.lora_A.weight', 'transformer_blocks.9.attn1.to_v.lora_A.weight', 'transformer_blocks.1.attn2.to_q.lora_A.weight', 'transformer_blocks.1.attn1.to_v.lora_A.weight', 'transformer_blocks.18.attn2.to_out.0.lora_A.weight', 'transformer_blocks.6.attn1.to_v.lora_B.weight', 'transformer_blocks.8.attn2.to_out.0.lora_A.weight', 'transformer_blocks.0.attn1.to_out.0.lora_A.weight', 'transformer_blocks.6.attn1.to_out.0.lora_A.weight', 'transformer_blocks.21.ff.net.2.lora_A.weight', 'transformer_blocks.17.attn1.to_q.lora_B.weight', 'transformer_blocks.22.attn2.to_k.lora_A.weight', 
'transformer_blocks.25.attn2.to_v.lora_A.weight', 'transformer_blocks.23.attn2.to_v.lora_A.weight', 'transformer_blocks.15.attn2.to_q.lora_B.weight', 'transformer_blocks.27.ff.net.2.lora_A.weight', 'transformer_blocks.13.attn1.to_q.lora_B.weight', 'transformer_blocks.15.ff.net.2.lora_A.weight', 'transformer_blocks.16.attn1.to_v.lora_A.weight', 'transformer_blocks.12.ff.net.2.lora_B.weight', 'transformer_blocks.16.attn1.to_q.lora_A.weight', 'transformer_blocks.17.attn2.to_v.lora_B.weight', 'transformer_blocks.11.ff.net.2.lora_B.weight', 'transformer_blocks.1.attn2.to_out.0.lora_B.weight', 'transformer_blocks.23.attn2.to_q.lora_B.weight', 'transformer_blocks.27.attn1.to_k.lora_A.weight', 'transformer_blocks.23.attn1.to_v.lora_A.weight', 'transformer_blocks.3.attn1.to_v.lora_B.weight', 'transformer_blocks.24.attn2.to_q.lora_B.weight', 'transformer_blocks.2.attn1.to_v.lora_A.weight', 'transformer_blocks.1.attn2.to_q.lora_B.weight', 'transformer_blocks.18.ff.net.0.proj.lora_B.weight', 'transformer_blocks.24.attn2.to_v.lora_A.weight', 'transformer_blocks.27.attn2.to_v.lora_B.weight', 'transformer_blocks.6.ff.net.0.proj.lora_A.weight', 'transformer_blocks.8.attn2.to_out.0.lora_B.weight', 'transformer_blocks.23.ff.net.0.proj.lora_B.weight', 'transformer_blocks.24.attn1.to_q.lora_B.weight', 'transformer_blocks.26.attn1.to_v.lora_B.weight', 'transformer_blocks.0.attn1.to_q.lora_A.weight', 'transformer_blocks.13.attn2.to_q.lora_A.weight', 'transformer_blocks.20.attn1.to_out.0.lora_B.weight', 'transformer_blocks.7.attn1.to_out.0.lora_B.weight', 'transformer_blocks.27.attn1.to_out.0.lora_A.weight', 'transformer_blocks.4.ff.net.0.proj.lora_B.weight', 'transformer_blocks.10.attn1.to_q.lora_A.weight', 'transformer_blocks.1.attn2.to_k.lora_A.weight', 'transformer_blocks.15.attn2.to_out.0.lora_A.weight', 'transformer_blocks.0.attn2.to_v.lora_B.weight', 'transformer_blocks.13.attn1.to_v.lora_B.weight', 'transformer_blocks.3.ff.net.2.lora_A.weight', 
'transformer_blocks.9.attn1.to_q.lora_A.weight', 'transformer_blocks.14.ff.net.2.lora_B.weight', 'transformer_blocks.18.attn1.to_v.lora_B.weight', 'transformer_blocks.23.attn2.to_out.0.lora_A.weight', 'transformer_blocks.16.attn2.to_q.lora_B.weight', 'transformer_blocks.27.attn1.to_k.lora_B.weight', 'transformer_blocks.3.attn1.to_v.lora_A.weight', 'transformer_blocks.27.attn1.to_v.lora_B.weight', 'transformer_blocks.11.attn1.to_q.lora_A.weight', 'transformer_blocks.15.attn2.to_q.lora_A.weight', 'transformer_blocks.1.ff.net.2.lora_B.weight', 'transformer_blocks.23.attn1.to_k.lora_A.weight', 'transformer_blocks.26.attn1.to_q.lora_A.weight', 'transformer_blocks.4.attn2.to_k.lora_B.weight', 'transformer_blocks.3.attn2.to_q.lora_B.weight', 'transformer_blocks.9.attn2.to_out.0.lora_A.weight', 'transformer_blocks.20.attn2.to_v.lora_A.weight', 'transformer_blocks.24.ff.net.2.lora_A.weight', 'transformer_blocks.11.attn1.to_out.0.lora_A.weight', 'transformer_blocks.3.attn2.to_v.lora_B.weight', 'transformer_blocks.21.attn1.to_q.lora_A.weight', 'transformer_blocks.17.ff.net.2.lora_B.weight', 'transformer_blocks.15.attn1.to_k.lora_B.weight', 'transformer_blocks.25.attn1.to_v.lora_B.weight', 'transformer_blocks.22.ff.net.0.proj.lora_A.weight', 'transformer_blocks.25.attn1.to_out.0.lora_B.weight', 'transformer_blocks.5.attn2.to_v.lora_A.weight', 'transformer_blocks.16.attn2.to_k.lora_B.weight', 'transformer_blocks.26.attn1.to_k.lora_B.weight', 'transformer_blocks.3.attn2.to_k.lora_B.weight', 'transformer_blocks.10.attn1.to_out.0.lora_B.weight', 'transformer_blocks.19.attn2.to_k.lora_A.weight', 'transformer_blocks.6.attn2.to_v.lora_A.weight', 'transformer_blocks.12.attn2.to_v.lora_B.weight', 'transformer_blocks.21.attn1.to_v.lora_A.weight', 'transformer_blocks.7.ff.net.0.proj.lora_A.weight', 'transformer_blocks.24.attn2.to_v.lora_B.weight', 'transformer_blocks.4.ff.net.2.lora_A.weight', 'transformer_blocks.15.attn2.to_v.lora_B.weight', 
'transformer_blocks.17.attn1.to_v.lora_A.weight', 'transformer_blocks.2.attn2.to_k.lora_B.weight', 'transformer_blocks.5.attn1.to_v.lora_A.weight', 'transformer_blocks.2.attn2.to_q.lora_B.weight', 'transformer_blocks.25.attn2.to_q.lora_A.weight', 'transformer_blocks.26.ff.net.2.lora_B.weight', 'transformer_blocks.23.attn1.to_q.lora_A.weight', 'transformer_blocks.16.attn2.to_k.lora_A.weight', 'transformer_blocks.19.attn1.to_k.lora_A.weight', 'transformer_blocks.12.ff.net.0.proj.lora_B.weight', 'transformer_blocks.20.ff.net.0.proj.lora_B.weight', 'transformer_blocks.16.attn2.to_v.lora_B.weight', 'transformer_blocks.21.attn1.to_out.0.lora_A.weight', 'transformer_blocks.21.attn2.to_v.lora_B.weight', 'transformer_blocks.16.attn1.to_out.0.lora_B.weight', 'transformer_blocks.9.attn2.to_q.lora_A.weight', 'transformer_blocks.5.ff.net.0.proj.lora_A.weight', 'transformer_blocks.26.attn2.to_k.lora_B.weight', 'transformer_blocks.19.attn2.to_v.lora_B.weight', 'transformer_blocks.17.attn2.to_out.0.lora_B.weight', 'transformer_blocks.6.attn1.to_q.lora_B.weight', 'transformer_blocks.21.attn1.to_q.lora_B.weight', 'transformer_blocks.26.attn1.to_k.lora_A.weight', 'transformer_blocks.9.attn2.to_k.lora_A.weight', 'transformer_blocks.1.ff.net.0.proj.lora_A.weight', 'transformer_blocks.22.attn1.to_out.0.lora_A.weight', 'transformer_blocks.11.ff.net.2.lora_A.weight', 'transformer_blocks.0.ff.net.2.lora_B.weight', 'transformer_blocks.14.attn1.to_v.lora_A.weight', 'transformer_blocks.7.attn1.to_v.lora_A.weight', 'transformer_blocks.1.ff.net.2.lora_A.weight', 'transformer_blocks.13.attn1.to_out.0.lora_A.weight', 'transformer_blocks.20.ff.net.2.lora_A.weight', 'transformer_blocks.2.ff.net.0.proj.lora_B.weight', 'transformer_blocks.11.attn2.to_out.0.lora_B.weight', 'transformer_blocks.8.attn1.to_out.0.lora_B.weight', 'transformer_blocks.4.attn2.to_v.lora_A.weight', 'transformer_blocks.20.attn2.to_q.lora_A.weight', 'transformer_blocks.1.attn2.to_v.lora_B.weight', 
'transformer_blocks.14.attn2.to_q.lora_B.weight', 'transformer_blocks.23.attn1.to_q.lora_B.weight', 'transformer_blocks.26.attn1.to_out.0.lora_A.weight', 'transformer_blocks.16.attn1.to_k.lora_A.weight', 'transformer_blocks.9.attn1.to_v.lora_B.weight', 'transformer_blocks.22.attn2.to_v.lora_B.weight', 'transformer_blocks.3.attn2.to_q.lora_A.weight', 'transformer_blocks.24.attn1.to_q.lora_A.weight', 'transformer_blocks.18.attn2.to_v.lora_A.weight', 'transformer_blocks.9.ff.net.0.proj.lora_B.weight', 'transformer_blocks.25.attn2.to_k.lora_A.weight', 'transformer_blocks.12.attn2.to_k.lora_A.weight', 'transformer_blocks.20.attn1.to_v.lora_A.weight', 'transformer_blocks.17.attn2.to_k.lora_B.weight', 'transformer_blocks.16.ff.net.2.lora_B.weight', 'transformer_blocks.0.attn2.to_v.lora_A.weight', 'transformer_blocks.17.attn1.to_q.lora_A.weight', 'transformer_blocks.18.attn2.to_v.lora_B.weight', 'transformer_blocks.7.ff.net.2.lora_B.weight', 'transformer_blocks.0.attn2.to_out.0.lora_A.weight', 'transformer_blocks.20.attn2.to_q.lora_B.weight', 'transformer_blocks.25.attn2.to_v.lora_B.weight', 'transformer_blocks.26.attn2.to_v.lora_B.weight', 'transformer_blocks.3.ff.net.0.proj.lora_B.weight', 'transformer_blocks.11.attn1.to_v.lora_B.weight', 'transformer_blocks.8.attn2.to_v.lora_A.weight', 'transformer_blocks.10.ff.net.2.lora_B.weight', 'transformer_blocks.7.ff.net.2.lora_A.weight', 'transformer_blocks.17.attn1.to_out.0.lora_B.weight', 'transformer_blocks.24.ff.net.0.proj.lora_A.weight', 'transformer_blocks.21.attn2.to_v.lora_A.weight', 'transformer_blocks.14.attn2.to_v.lora_B.weight', 'transformer_blocks.13.attn2.to_k.lora_A.weight', 'transformer_blocks.24.attn1.to_k.lora_A.weight', 'transformer_blocks.23.attn1.to_k.lora_B.weight', 'transformer_blocks.23.ff.net.2.lora_A.weight', 'transformer_blocks.3.ff.net.2.lora_B.weight', 'transformer_blocks.14.attn1.to_k.lora_A.weight', 'transformer_blocks.23.attn2.to_k.lora_B.weight', 
'transformer_blocks.19.attn1.to_out.0.lora_A.weight', 'transformer_blocks.6.attn2.to_v.lora_B.weight', 'transformer_blocks.1.attn2.to_out.0.lora_A.weight', 'transformer_blocks.19.attn1.to_v.lora_A.weight', 'transformer_blocks.2.attn1.to_out.0.lora_A.weight', 'transformer_blocks.16.attn1.to_v.lora_B.weight', 'transformer_blocks.13.ff.net.0.proj.lora_B.weight', 'transformer_blocks.11.attn1.to_out.0.lora_B.weight', 'transformer_blocks.25.attn1.to_v.lora_A.weight', 'transformer_blocks.0.attn2.to_k.lora_B.weight', 'transformer_blocks.12.ff.net.0.proj.lora_A.weight', 'transformer_blocks.15.ff.net.0.proj.lora_B.weight', 'transformer_blocks.6.ff.net.0.proj.lora_B.weight', 'transformer_blocks.10.attn2.to_out.0.lora_A.weight', 'transformer_blocks.14.attn2.to_out.0.lora_B.weight', 'transformer_blocks.20.attn2.to_out.0.lora_B.weight', 'transformer_blocks.15.attn2.to_v.lora_A.weight', 'transformer_blocks.18.attn1.to_out.0.lora_A.weight', 'transformer_blocks.5.attn2.to_k.lora_B.weight', 'transformer_blocks.8.attn1.to_k.lora_B.weight', 'transformer_blocks.25.ff.net.0.proj.lora_A.weight', 'transformer_blocks.17.attn2.to_out.0.lora_A.weight', 'transformer_blocks.18.attn1.to_q.lora_B.weight', 'transformer_blocks.0.ff.net.0.proj.lora_A.weight', 'transformer_blocks.17.attn2.to_k.lora_A.weight', 'transformer_blocks.26.ff.net.2.lora_A.weight', 'transformer_blocks.4.ff.net.0.proj.lora_A.weight', 'transformer_blocks.15.ff.net.0.proj.lora_A.weight', 'transformer_blocks.10.ff.net.0.proj.lora_B.weight', 'transformer_blocks.8.attn1.to_v.lora_A.weight', 'transformer_blocks.14.attn1.to_q.lora_A.weight', 'transformer_blocks.22.attn1.to_v.lora_A.weight', 'transformer_blocks.6.attn2.to_q.lora_A.weight', 'transformer_blocks.6.attn1.to_v.lora_A.weight', 'transformer_blocks.2.attn2.to_out.0.lora_B.weight', 'transformer_blocks.19.ff.net.2.lora_A.weight', 'transformer_blocks.25.attn1.to_out.0.lora_A.weight', 'transformer_blocks.2.attn1.to_v.lora_B.weight', 
'transformer_blocks.6.attn2.to_out.0.lora_A.weight', 'transformer_blocks.7.attn2.to_k.lora_A.weight', 'transformer_blocks.10.attn2.to_v.lora_A.weight', 'transformer_blocks.24.attn2.to_out.0.lora_A.weight', 'transformer_blocks.1.ff.net.0.proj.lora_B.weight', 'transformer_blocks.4.attn1.to_v.lora_B.weight', 'transformer_blocks.26.attn1.to_q.lora_B.weight', 'transformer_blocks.18.attn2.to_k.lora_B.weight', 'transf
PixArt: LoRA conversion has missing keys! (probably)
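The leftover keys above are diffusers/PEFT-style names (`transformer_blocks.N.attnX.to_q.lora_A.weight`, etc.) that the converter failed to map onto the PixArt checkpoint. A minimal sketch of the kind of remapping involved is below; the target key names and the `convert_peft_key` helper are illustrative assumptions, not the repository's actual conversion table:

```python
import re

def convert_peft_key(key):
    """Illustrative remap of a diffusers/PEFT LoRA key to a PixArt-style
    checkpoint key. The target naming here is an assumption for this sketch."""
    m = re.fullmatch(
        r"transformer_blocks\.(\d+)\.(attn1|attn2)\.to_([qkv])\.lora_([AB])\.weight",
        key,
    )
    if m:
        block, attn, proj, ab = m.groups()
        target = "attn" if attn == "attn1" else "cross_attn"
        direction = "down" if ab == "A" else "up"
        return f"blocks.{block}.{target}.{proj}_proj.lora_{direction}.weight"
    return None  # unhandled families, e.g. to_out / feed-forward keys

# Any key the converter returns None for ends up as a "leftover" key:
keys = [
    "transformer_blocks.0.attn1.to_q.lora_A.weight",
    "transformer_blocks.0.attn1.to_out.0.lora_A.weight",
]
leftover = [k for k in keys if convert_peft_key(k) is None]
```

A pattern like this only covers the `to_q`/`to_k`/`to_v` projections, which would explain why most keys in the dump above are reported as leftovers.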
frutiemax92 commented 5 months ago

I replaced the `get_lora_depth` function, but the `ff` and `to_out` layers are still not handled correctly.

PixArt: LoRA conversion has leftover keys! (350 vs 574)
['transformer_blocks.14.ff.net.0.proj.lora_A.weight', 'transformer_blocks.18.ff.net.2.lora_B.weight', 'transformer_blocks.7.ff.net.0.proj.lora_A.weight', 'transformer_blocks.7.attn1.to_out.0.lora_A.weight', 'transformer_blocks.4.attn1.to_out.0.lora_A.weight', 'transformer_blocks.0.ff.net.0.proj.lora_B.weight', 'transformer_blocks.21.ff.net.2.lora_A.weight', 'transformer_blocks.9.ff.net.0.proj.lora_B.weight', 'transformer_blocks.11.ff.net.2.lora_A.weight', 'transformer_blocks.17.ff.net.2.lora_B.weight', 'transformer_blocks.24.ff.net.2.lora_B.weight', 'transformer_blocks.5.attn1.to_out.0.lora_B.weight', 'transformer_blocks.26.ff.net.2.lora_A.weight', 'transformer_blocks.19.attn1.to_out.0.lora_B.weight', 'transformer_blocks.19.attn2.to_out.0.lora_B.weight', 'transformer_blocks.23.ff.net.2.lora_A.weight', 'transformer_blocks.2.ff.net.0.proj.lora_A.weight', 'transformer_blocks.18.attn1.to_out.0.lora_A.weight', 'transformer_blocks.8.ff.net.2.lora_B.weight', 'transformer_blocks.23.attn1.to_out.0.lora_B.weight', 'transformer_blocks.4.ff.net.0.proj.lora_B.weight', 'transformer_blocks.2.ff.net.2.lora_A.weight', 'transformer_blocks.26.attn1.to_out.0.lora_B.weight', 'transformer_blocks.5.attn1.to_out.0.lora_A.weight', 'transformer_blocks.5.attn2.to_out.0.lora_A.weight', 'transformer_blocks.20.ff.net.0.proj.lora_A.weight', 'transformer_blocks.15.ff.net.2.lora_B.weight', 'transformer_blocks.4.ff.net.2.lora_A.weight', 'transformer_blocks.10.attn1.to_out.0.lora_A.weight', 'transformer_blocks.26.attn1.to_out.0.lora_A.weight', 'transformer_blocks.16.attn2.to_out.0.lora_B.weight', 'transformer_blocks.23.ff.net.0.proj.lora_A.weight', 'transformer_blocks.13.attn2.to_out.0.lora_B.weight', 'transformer_blocks.8.ff.net.0.proj.lora_A.weight', 'transformer_blocks.27.ff.net.2.lora_B.weight', 'transformer_blocks.24.attn1.to_out.0.lora_B.weight', 'transformer_blocks.15.attn2.to_out.0.lora_B.weight', 'transformer_blocks.12.ff.net.0.proj.lora_B.weight', 
'transformer_blocks.16.ff.net.0.proj.lora_A.weight', 'transformer_blocks.23.ff.net.0.proj.lora_B.weight', 'transformer_blocks.16.ff.net.2.lora_A.weight', 'transformer_blocks.8.attn1.to_out.0.lora_B.weight', 'transformer_blocks.22.ff.net.2.lora_B.weight', 'transformer_blocks.11.ff.net.0.proj.lora_B.weight', 'transformer_blocks.26.attn2.to_out.0.lora_B.weight', 'transformer_blocks.9.attn1.to_out.0.lora_B.weight', 'transformer_blocks.10.attn2.to_out.0.lora_A.weight', 'transformer_blocks.17.ff.net.0.proj.lora_B.weight', 'transformer_blocks.21.attn1.to_out.0.lora_B.weight', 'transformer_blocks.15.attn2.to_out.0.lora_A.weight', 'transformer_blocks.3.ff.net.2.lora_A.weight', 'transformer_blocks.7.attn1.to_out.0.lora_B.weight', 'transformer_blocks.14.attn2.to_out.0.lora_A.weight', 'transformer_blocks.16.attn1.to_out.0.lora_A.weight', 'transformer_blocks.12.ff.net.0.proj.lora_A.weight', 'transformer_blocks.23.attn1.to_out.0.lora_A.weight', 'transformer_blocks.20.attn1.to_out.0.lora_B.weight', 'transformer_blocks.3.attn2.to_out.0.lora_A.weight', 'transformer_blocks.26.ff.net.0.proj.lora_A.weight', 'transformer_blocks.19.ff.net.2.lora_A.weight', 'transformer_blocks.13.ff.net.2.lora_A.weight', 'transformer_blocks.7.ff.net.2.lora_B.weight', 'transformer_blocks.8.ff.net.0.proj.lora_B.weight', 'transformer_blocks.10.ff.net.0.proj.lora_A.weight', 'transformer_blocks.16.ff.net.0.proj.lora_B.weight', 'transformer_blocks.24.ff.net.0.proj.lora_B.weight', 'transformer_blocks.10.ff.net.0.proj.lora_B.weight', 'transformer_blocks.2.attn1.to_out.0.lora_A.weight', 'transformer_blocks.23.ff.net.2.lora_B.weight', 'transformer_blocks.9.ff.net.0.proj.lora_A.weight', 'transformer_blocks.12.attn2.to_out.0.lora_B.weight', 'transformer_blocks.1.attn1.to_out.0.lora_B.weight', 'transformer_blocks.10.attn1.to_out.0.lora_B.weight', 'transformer_blocks.9.attn2.to_out.0.lora_A.weight', 'transformer_blocks.22.attn1.to_out.0.lora_A.weight', 'transformer_blocks.20.attn2.to_out.0.lora_A.weight', 
'transformer_blocks.4.attn2.to_out.0.lora_B.weight', 'transformer_blocks.6.ff.net.0.proj.lora_B.weight', 'transformer_blocks.0.ff.net.2.lora_B.weight', 'transformer_blocks.27.ff.net.0.proj.lora_A.weight', 'transformer_blocks.1.attn2.to_out.0.lora_A.weight', 'transformer_blocks.14.ff.net.2.lora_B.weight', 'transformer_blocks.14.attn1.to_out.0.lora_A.weight', 'transformer_blocks.13.attn2.to_out.0.lora_A.weight', 'transformer_blocks.7.attn2.to_out.0.lora_A.weight', 'transformer_blocks.21.attn2.to_out.0.lora_A.weight', 'transformer_blocks.9.ff.net.2.lora_A.weight', 'transformer_blocks.11.ff.net.0.proj.lora_A.weight', 'transformer_blocks.6.ff.net.2.lora_B.weight', 'transformer_blocks.18.ff.net.0.proj.lora_B.weight', 'transformer_blocks.13.attn1.to_out.0.lora_B.weight', 'transformer_blocks.6.attn2.to_out.0.lora_A.weight', 'transformer_blocks.0.attn1.to_out.0.lora_A.weight', 'transformer_blocks.11.attn2.to_out.0.lora_A.weight', 'transformer_blocks.5.ff.net.2.lora_A.weight', 'transformer_blocks.1.attn1.to_out.0.lora_A.weight', 'transformer_blocks.25.ff.net.0.proj.lora_A.weight', 'transformer_blocks.19.attn1.to_out.0.lora_A.weight', 'transformer_blocks.23.attn2.to_out.0.lora_B.weight', 'transformer_blocks.10.ff.net.2.lora_A.weight', 'transformer_blocks.4.ff.net.2.lora_B.weight', 'transformer_blocks.16.ff.net.2.lora_B.weight', 'transformer_blocks.13.attn1.to_out.0.lora_A.weight', 'transformer_blocks.3.ff.net.0.proj.lora_A.weight', 'transformer_blocks.4.attn2.to_out.0.lora_A.weight', 'transformer_blocks.3.attn2.to_out.0.lora_B.weight', 'transformer_blocks.10.ff.net.2.lora_B.weight', 'transformer_blocks.6.ff.net.0.proj.lora_A.weight', 'transformer_blocks.6.ff.net.2.lora_A.weight', 'transformer_blocks.0.attn2.to_out.0.lora_B.weight', 'transformer_blocks.19.ff.net.0.proj.lora_B.weight', 'transformer_blocks.13.ff.net.0.proj.lora_A.weight', 'transformer_blocks.24.attn2.to_out.0.lora_B.weight', 'transformer_blocks.1.attn2.to_out.0.lora_B.weight', 
'transformer_blocks.25.attn2.to_out.0.lora_B.weight', 'transformer_blocks.27.attn2.to_out.0.lora_B.weight', 'transformer_blocks.20.ff.net.2.lora_B.weight', 'transformer_blocks.1.ff.net.0.proj.lora_A.weight', 'transformer_blocks.3.ff.net.0.proj.lora_B.weight', 'transformer_blocks.12.attn1.to_out.0.lora_B.weight', 'transformer_blocks.24.attn1.to_out.0.lora_A.weight', 'transformer_blocks.3.attn1.to_out.0.lora_A.weight', 'transformer_blocks.9.ff.net.2.lora_B.weight', 'transformer_blocks.17.attn1.to_out.0.lora_B.weight', 'transformer_blocks.27.ff.net.0.proj.lora_B.weight', 'transformer_blocks.20.attn2.to_out.0.lora_B.weight', 'transformer_blocks.24.ff.net.0.proj.lora_A.weight', 'transformer_blocks.1.ff.net.0.proj.lora_B.weight', 'transformer_blocks.11.attn1.to_out.0.lora_A.weight', 'transformer_blocks.2.ff.net.0.proj.lora_B.weight', 'transformer_blocks.17.attn2.to_out.0.lora_A.weight', 'transformer_blocks.6.attn1.to_out.0.lora_A.weight', 'transformer_blocks.11.attn1.to_out.0.lora_B.weight', 'transformer_blocks.7.ff.net.0.proj.lora_B.weight', 'transformer_blocks.12.ff.net.2.lora_A.weight', 'transformer_blocks.16.attn1.to_out.0.lora_B.weight', 'transformer_blocks.14.ff.net.0.proj.lora_B.weight', 'transformer_blocks.25.attn1.to_out.0.lora_A.weight', 'transformer_blocks.26.ff.net.2.lora_B.weight', 'transformer_blocks.11.attn2.to_out.0.lora_B.weight', 'transformer_blocks.5.ff.net.0.proj.lora_B.weight', 'transformer_blocks.9.attn2.to_out.0.lora_B.weight', 'transformer_blocks.15.ff.net.2.lora_A.weight', 'transformer_blocks.12.attn1.to_out.0.lora_A.weight', 'transformer_blocks.15.ff.net.0.proj.lora_B.weight', 'transformer_blocks.18.ff.net.2.lora_A.weight', 'transformer_blocks.0.attn2.to_out.0.lora_A.weight', 'transformer_blocks.18.attn2.to_out.0.lora_A.weight', 'transformer_blocks.3.attn1.to_out.0.lora_B.weight', 'transformer_blocks.20.attn1.to_out.0.lora_A.weight', 'transformer_blocks.14.attn2.to_out.0.lora_B.weight', 'transformer_blocks.22.attn1.to_out.0.lora_B.weight', 
'transformer_blocks.25.ff.net.0.proj.lora_B.weight', 'transformer_blocks.2.attn2.to_out.0.lora_A.weight', 'transformer_blocks.14.attn1.to_out.0.lora_B.weight', 'transformer_blocks.21.attn1.to_out.0.lora_A.weight', 'transformer_blocks.22.attn2.to_out.0.lora_B.weight', 'transformer_blocks.7.ff.net.2.lora_A.weight', 'transformer_blocks.25.attn1.to_out.0.lora_B.weight', 'transformer_blocks.12.attn2.to_out.0.lora_A.weight', 'transformer_blocks.4.ff.net.0.proj.lora_A.weight', 'transformer_blocks.24.ff.net.2.lora_A.weight', 'transformer_blocks.27.attn2.to_out.0.lora_A.weight', 'transformer_blocks.17.ff.net.0.proj.lora_A.weight', 'transformer_blocks.5.attn2.to_out.0.lora_B.weight', 'transformer_blocks.19.attn2.to_out.0.lora_A.weight', 'transformer_blocks.5.ff.net.2.lora_B.weight', 'transformer_blocks.11.ff.net.2.lora_B.weight', 'transformer_blocks.0.attn1.to_out.0.lora_B.weight', 'transformer_blocks.2.attn1.to_out.0.lora_B.weight', 'transformer_blocks.21.ff.net.2.lora_B.weight', 'transformer_blocks.17.attn2.to_out.0.lora_B.weight', 'transformer_blocks.10.attn2.to_out.0.lora_B.weight', 'transformer_blocks.14.ff.net.2.lora_A.weight', 'transformer_blocks.8.attn2.to_out.0.lora_A.weight', 'transformer_blocks.25.ff.net.2.lora_A.weight', 'transformer_blocks.6.attn1.to_out.0.lora_B.weight', 'transformer_blocks.12.ff.net.2.lora_B.weight', 'transformer_blocks.19.ff.net.0.proj.lora_A.weight', 'transformer_blocks.22.attn2.to_out.0.lora_A.weight', 'transformer_blocks.27.attn1.to_out.0.lora_A.weight', 'transformer_blocks.6.attn2.to_out.0.lora_B.weight', 'transformer_blocks.24.attn2.to_out.0.lora_A.weight', 'transformer_blocks.26.ff.net.0.proj.lora_B.weight', 'transformer_blocks.17.attn1.to_out.0.lora_A.weight', 'transformer_blocks.18.attn2.to_out.0.lora_B.weight', 'transformer_blocks.13.ff.net.0.proj.lora_B.weight', 'transformer_blocks.22.ff.net.2.lora_A.weight', 'transformer_blocks.20.ff.net.2.lora_A.weight', 'transformer_blocks.15.attn1.to_out.0.lora_B.weight', 
'transformer_blocks.9.attn1.to_out.0.lora_A.weight', 'transformer_blocks.26.attn2.to_out.0.lora_A.weight', 'transformer_blocks.1.ff.net.2.lora_B.weight', 'transformer_blocks.0.ff.net.2.lora_A.weight', 'transformer_blocks.1.ff.net.2.lora_A.weight', 'transformer_blocks.13.ff.net.2.lora_B.weight', 'transformer_blocks.16.attn2.to_out.0.lora_A.weight', 'transformer_blocks.19.ff.net.2.lora_B.weight', 'transformer_blocks.23.attn2.to_out.0.lora_A.weight', 'transformer_blocks.17.ff.net.2.lora_A.weight', 'transformer_blocks.2.ff.net.2.lora_B.weight', 'transformer_blocks.20.ff.net.0.proj.lora_B.weight', 'transformer_blocks.4.attn1.to_out.0.lora_B.weight', 'transformer_blocks.22.ff.net.0.proj.lora_A.weight', 'transformer_blocks.8.ff.net.2.lora_A.weight', 'transformer_blocks.21.attn2.to_out.0.lora_B.weight', 'transformer_blocks.18.attn1.to_out.0.lora_B.weight', 'transformer_blocks.21.ff.net.0.proj.lora_A.weight', 'transformer_blocks.8.attn1.to_out.0.lora_A.weight', 'transformer_blocks.27.attn1.to_out.0.lora_B.weight', 'transformer_blocks.25.attn2.to_out.0.lora_A.weight', 'transformer_blocks.27.ff.net.2.lora_A.weight', 'transformer_blocks.5.ff.net.0.proj.lora_A.weight', 'transformer_blocks.25.ff.net.2.lora_B.weight', 'transformer_blocks.18.ff.net.0.proj.lora_A.weight', 'transformer_blocks.15.attn1.to_out.0.lora_A.weight', 'transformer_blocks.2.attn2.to_out.0.lora_B.weight', 'transformer_blocks.21.ff.net.0.proj.lora_B.weight', 'transformer_blocks.8.attn2.to_out.0.lora_B.weight', 'transformer_blocks.3.ff.net.2.lora_B.weight', 'transformer_blocks.22.ff.net.0.proj.lora_B.weight', 'transformer_blocks.0.ff.net.0.proj.lora_A.weight', 'transformer_blocks.15.ff.net.0.proj.lora_A.weight', 'transformer_blocks.7.attn2.to_out.0.lora_B.weight'] 
PixArt: LoRA conversion has missing keys! (probably)
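Notably, the remaining leftovers are all `to_out.0`, `ff.net.0.proj`, and `ff.net.2` keys: these have an extra numeric path segment (diffusers wraps the output projection and feed-forward in indexed containers), which a pattern written for plain `to_q`/`to_k`/`to_v` names would miss. A hedged sketch of patterns covering all four key families seen in the log (the pattern list is an assumption for illustration, not the repository's code):

```python
import re

# Hypothetical patterns for the four key families in the log above.
PATTERNS = [
    r"transformer_blocks\.\d+\.attn[12]\.to_[qkv]\.lora_[AB]\.weight",
    r"transformer_blocks\.\d+\.attn[12]\.to_out\.0\.lora_[AB]\.weight",  # note the ".0"
    r"transformer_blocks\.\d+\.ff\.net\.0\.proj\.lora_[AB]\.weight",     # first FF linear
    r"transformer_blocks\.\d+\.ff\.net\.2\.lora_[AB]\.weight",           # second FF linear
]

def is_handled(key):
    """True if the key matches one of the known LoRA key families."""
    return any(re.fullmatch(p, key) for p in PATTERNS)
```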
frutiemax92 commented 5 months ago

Even with the latest merged PR, I am still getting different results between using the node and merging the LoRA into a checkpoint. I believe some layers/weights are still missing!

frutiemax92 commented 4 months ago

Here is a lora example I just published: https://civitai.com/models/578802?modelVersionId=645548

I've noticed that I need to crank the LoRA strength quite a bit, i.e. up to 1.8, to get good results. Is there anything in the patching algorithm that would explain needing such a boost in strength?
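One possible explanation, offered here as an assumption rather than a diagnosis of this repository's code: PEFT scales each LoRA update by `alpha / rank`, so if a converter drops that factor, the user-facing strength slider has to absorb it. A minimal sketch of the arithmetic:

```python
import torch

# Sketch of how a LoRA update is applied, assuming PEFT-style A/B factors.
# PEFT scales the weight delta by alpha / rank; if a converter drops that
# factor, the visible "strength" must compensate for it.
rank, alpha = 4, 8
A = torch.randn(rank, 64)  # lora_A: (rank, in_features)
B = torch.randn(64, rank)  # lora_B: (out_features, rank)

def lora_delta(strength, apply_alpha=True):
    scale = (alpha / rank) if apply_alpha else 1.0
    return strength * scale * (B @ A)

# Here alpha/rank = 2, so dropping it means strength must double to match:
d_ref = lora_delta(1.0, apply_alpha=True)
d_comp = lora_delta(2.0, apply_alpha=False)
```

If the checkpoint-merge path applies the `alpha / rank` factor while the node path does not (or vice versa), that alone would make an inflated strength value look "correct".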