Closed: 0xmihutao closed this issue 8 months ago.
### What happened?

Getting this error when including the v3 sd15 adapter LoRA in the prompt:
raise AssertionError(f"Could not find a module type (out of {', '.join([x.__class__.__name__ for x in module_types])}) that would accept those keys: {', '.join(weights.w)}") AssertionError: Could not find a module type (out of ModuleTypeLora, ModuleTypeHada, ModuleTypeIa3, ModuleTypeLokr, ModuleTypeFull, ModuleTypeNorm, ModuleTypeGLora, ModuleTypeOFT) that would accept those keys: 0.attentions.0.transformer_blocks.0.attn1.processor.to_q_lora.down.weight, 0.attentions.0.transformer_blocks.0.attn1.processor.to_q_lora.up.weight, 0.attentions.0.transformer_blocks.0.attn1.processor.to_k_lora.down.weight, 0.attentions.0.transformer_blocks.0.attn1.processor.to_k_lora.up.weight, 0.attentions.0.transformer_blocks.0.attn1.processor.to_v_lora.down.weight, 0.attentions.0.transformer_blocks.0.attn1.processor.to_v_lora.up.weight, 0.attentions.0.transformer_blocks.0.attn1.processor.to_out_lora.down.weight, 0.attentions.0.transformer_blocks.0.attn1.processor.to_out_lora.up.weight, 0.attentions.0.transformer_blocks.0.attn2.processor.to_q_lora.down.weight, 0.attentions.0.transformer_blocks.0.attn2.processor.to_q_lora.up.weight, 0.attentions.0.transformer_blocks.0.attn2.processor.to_k_lora.down.weight, 0.attentions.0.transformer_blocks.0.attn2.processor.to_k_lora.up.weight, 0.attentions.0.transformer_blocks.0.attn2.processor.to_v_lora.down.weight, 0.attentions.0.transformer_blocks.0.attn2.processor.to_v_lora.up.weight, 0.attentions.0.transformer_blocks.0.attn2.processor.to_out_lora.down.weight, 0.attentions.0.transformer_blocks.0.attn2.processor.to_out_lora.up.weight, 0.attentions.1.transformer_blocks.0.attn1.processor.to_q_lora.down.weight, 0.attentions.1.transformer_blocks.0.attn1.processor.to_q_lora.up.weight, 0.attentions.1.transformer_blocks.0.attn1.processor.to_k_lora.down.weight, 0.attentions.1.transformer_blocks.0.attn1.processor.to_k_lora.up.weight, 0.attentions.1.transformer_blocks.0.attn1.processor.to_v_lora.down.weight, 0.attentions.1.transformer_blocks.0.attn1.processor.to_v_lora.up.weight, 0.attentions.1.transformer_blocks.0.attn1.processor.to_out_lora.down.weight, 0.attentions.1.transformer_blocks.0.attn1.processor.to_out_lora.up.weight, 0.attentions.1.transformer_blocks.0.attn2.processor.to_q_lora.down.weight, 0.attentions.1.transformer_blocks.0.attn2.processor.to_q_lora.up.weight, 0.attentions.1.transformer_blocks.0.attn2.processor.to_k_lora.down.weight, 0.attentions.1.transformer_blocks.0.attn2.processor.to_k_lora.up.weight, 0.attentions.1.transformer_blocks.0.attn2.processor.to_v_lora.down.weight, 0.attentions.1.transformer_blocks.0.attn2.processor.to_v_lora.up.weight, 0.attentions.1.transformer_blocks.0.attn2.processor.to_out_lora.down.weight, 0.attentions.1.transformer_blocks.0.attn2.processor.to_out_lora.up.weight, 1.attentions.0.transformer_blocks.0.attn1.processor.to_q_lora.down.weight, 1.attentions.0.transformer_blocks.0.attn1.processor.to_q_lora.up.weight, 1.attentions.0.transformer_blocks.0.attn1.processor.to_k_lora.down.weight, 1.attentions.0.transformer_blocks.0.attn1.processor.to_k_lora.up.weight, 1.attentions.0.transformer_blocks.0.attn1.processor.to_v_lora.down.weight, 1.attentions.0.transformer_blocks.0.attn1.processor.to_v_lora.up.weight, 1.attentions.0.transformer_blocks.0.attn1.processor.to_out_lora.down.weight, 1.attentions.0.transformer_blocks.0.attn1.processor.to_out_lora.up.weight, 1.attentions.0.transformer_blocks.0.attn2.processor.to_q_lora.down.weight, 1.attentions.0.transformer_blocks.0.attn2.processor.to_q_lora.up.weight, 
1.attentions.0.transformer_blocks.0.attn2.processor.to_k_lora.down.weight, 1.attentions.0.transformer_blocks.0.attn2.processor.to_k_lora.up.weight, 1.attentions.0.transformer_blocks.0.attn2.processor.to_v_lora.down.weight, 1.attentions.0.transformer_blocks.0.attn2.processor.to_v_lora.up.weight, 1.attentions.0.transformer_blocks.0.attn2.processor.to_out_lora.down.weight, 1.attentions.0.transformer_blocks.0.attn2.processor.to_out_lora.up.weight, 1.attentions.1.transformer_blocks.0.attn1.processor.to_q_lora.down.weight, 1.attentions.1.transformer_blocks.0.attn1.processor.to_q_lora.up.weight, 1.attentions.1.transformer_blocks.0.attn1.processor.to_k_lora.down.weight, 1.attentions.1.transformer_blocks.0.attn1.processor.to_k_lora.up.weight, 1.attentions.1.transformer_blocks.0.attn1.processor.to_v_lora.down.weight, 1.attentions.1.transformer_blocks.0.attn1.processor.to_v_lora.up.weight, 1.attentions.1.transformer_blocks.0.attn1.processor.to_out_lora.down.weight, 1.attentions.1.transformer_blocks.0.attn1.processor.to_out_lora.up.weight, 1.attentions.1.transformer_blocks.0.attn2.processor.to_q_lora.down.weight, 1.attentions.1.transformer_blocks.0.attn2.processor.to_q_lora.up.weight, 1.attentions.1.transformer_blocks.0.attn2.processor.to_k_lora.down.weight, 1.attentions.1.transformer_blocks.0.attn2.processor.to_k_lora.up.weight, 1.attentions.1.transformer_blocks.0.attn2.processor.to_v_lora.down.weight, 1.attentions.1.transformer_blocks.0.attn2.processor.to_v_lora.up.weight, 1.attentions.1.transformer_blocks.0.attn2.processor.to_out_lora.down.weight, 1.attentions.1.transformer_blocks.0.attn2.processor.to_out_lora.up.weight, 2.attentions.0.transformer_blocks.0.attn1.processor.to_q_lora.down.weight, 2.attentions.0.transformer_blocks.0.attn1.processor.to_q_lora.up.weight, 2.attentions.0.transformer_blocks.0.attn1.processor.to_k_lora.down.weight, 2.attentions.0.transformer_blocks.0.attn1.processor.to_k_lora.up.weight, 2.attentions.0.transformer_blocks.0.attn1.processor.to_v_lora.down.weight, 2.attentions.0.transformer_blocks.0.attn1.processor.to_v_lora.up.weight, 2.attentions.0.transformer_blocks.0.attn1.processor.to_out_lora.down.weight, 2.attentions.0.transformer_blocks.0.attn1.processor.to_out_lora.up.weight, 2.attentions.0.transformer_blocks.0.attn2.processor.to_q_lora.down.weight, 2.attentions.0.transformer_blocks.0.attn2.processor.to_q_lora.up.weight, 2.attentions.0.transformer_blocks.0.attn2.processor.to_k_lora.down.weight, 2.attentions.0.transformer_blocks.0.attn2.processor.to_k_lora.up.weight, 2.attentions.0.transformer_blocks.0.attn2.processor.to_v_lora.down.weight, 2.attentions.0.transformer_blocks.0.attn2.processor.to_v_lora.up.weight, 2.attentions.0.transformer_blocks.0.attn2.processor.to_out_lora.down.weight, 2.attentions.0.transformer_blocks.0.attn2.processor.to_out_lora.up.weight, 2.attentions.1.transformer_blocks.0.attn1.processor.to_q_lora.down.weight, 2.attentions.1.transformer_blocks.0.attn1.processor.to_q_lora.up.weight, 2.attentions.1.transformer_blocks.0.attn1.processor.to_k_lora.down.weight, 2.attentions.1.transformer_blocks.0.attn1.processor.to_k_lora.up.weight, 2.attentions.1.transformer_blocks.0.attn1.processor.to_v_lora.down.weight, 2.attentions.1.transformer_blocks.0.attn1.processor.to_v_lora.up.weight, 2.attentions.1.transformer_blocks.0.attn1.processor.to_out_lora.down.weight, 2.attentions.1.transformer_blocks.0.attn1.processor.to_out_lora.up.weight, 2.attentions.1.transformer_blocks.0.attn2.processor.to_q_lora.down.weight, 
2.attentions.1.transformer_blocks.0.attn2.processor.to_q_lora.up.weight, 2.attentions.1.transformer_blocks.0.attn2.processor.to_k_lora.down.weight, 2.attentions.1.transformer_blocks.0.attn2.processor.to_k_lora.up.weight, 2.attentions.1.transformer_blocks.0.attn2.processor.to_v_lora.down.weight, 2.attentions.1.transformer_blocks.0.attn2.processor.to_v_lora.up.weight, 2.attentions.1.transformer_blocks.0.attn2.processor.to_out_lora.down.weight, 2.attentions.1.transformer_blocks.0.attn2.processor.to_out_lora.up.weight
### Steps to reproduce the problem

Add the adapter LoRA to the prompt (see the example below), then select the v3_sd15_mm.ckpt motion module in the AnimateDiff settings.
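For reference, "add to the prompt" means the webui's extra-networks LoRA syntax; a minimal example is shown below (the name `v3_sd15_adapter` is an assumption, use whatever your local file is called, without the extension):

```
your prompt here, <lora:v3_sd15_adapter:1>
```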
### What should have happened?

It should not have raised this assertion error.
### Commit where the problem happens

webui: cf2772fab0af5573da775e7437e6acdca424f26e
extension: 4c3ec2e5
### What browsers do you use to access the UI?

Google Chrome
### Command Line Arguments

./webui.sh -f --listen --ckpt-dir models/Stable-diffusion --vae-dir models/VAE --embeddings-dir embeddings --lora-dir models/Lora --share --enable-insecure-extension-access --xformers --no-half-vae
### Additional information

_No response_
Reply: Use the adapter LoRA from my HF repo; don't use the original.
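If you want to verify which format a given adapter file is in before loading it, a minimal Python sketch follows (the path and filename are assumptions, adjust to your local file); it just prints the stored keys so you can see whether they are Diffusers-style names like `...processor.to_q_lora.down.weight` (which trigger the assertion above) or webui/kohya-style names like `lora_unet_...`:

```python
# Minimal sketch: print the first few keys of a LoRA file to check its format.
# The path below is an assumption; point it at your actual adapter file.
import torch
from safetensors.torch import load_file

path = "models/Lora/v3_sd15_adapter.safetensors"

if path.endswith(".safetensors"):
    state_dict = load_file(path)
else:
    # .ckpt / .pt adapters are plain torch checkpoints
    state_dict = torch.load(path, map_location="cpu")

for key in list(state_dict)[:10]:  # the first few keys are enough to tell
    print(key)
```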