Mikubill / sd-webui-controlnet

WebUI extension for ControlNet
GNU General Public License v3.0

Feature request: support for new T2i Style Adapter #475

Closed. AugmentedRealityCat closed this issue 1 year ago.

AugmentedRealityCat commented 1 year ago

New T2i models have been released, and one of them is the T2i Style Adapter.


I've downloaded the model over here: https://huggingface.co/TencentARC/T2I-Adapter/tree/main/models

I've installed it with the other T2i and ControlNet models in stable-diffusion-webui-master\extensions\sd-webui-controlnet\models

I don't know if it requires a preprocessor, but it does seem to interact with Embeddings, and that might explain why it's not working at the moment. Here is the error log:

To create a public link, set `share=True` in `launch()`.
Loading model: t2iadapter_style_sd14v1 [202e85cc]
Loaded state_dict from [C:\stable-diffusion-webui-master\extensions\sd-webui-controlnet\models\t2iadapter_style_sd14v1.pth]
Error running process: C:\stable-diffusion-webui-master\extensions\sd-webui-controlnet\scripts\controlnet.py
Traceback (most recent call last):
  File "C:\stable-diffusion-webui-master\modules\scripts.py", line 386, in process
    script.process(p, *script_args)
  File "C:\stable-diffusion-webui-master\extensions\sd-webui-controlnet\scripts\controlnet.py", line 608, in process
    else self.build_control_model(p, unet, model, lowvram)
  File "C:\stable-diffusion-webui-master\extensions\sd-webui-controlnet\scripts\controlnet.py", line 467, in build_control_model
    network = network_module(
  File "C:\stable-diffusion-webui-master\extensions\sd-webui-controlnet\scripts\cldm.py", line 107, in __init__
    self.control_model.load_state_dict(state_dict)
  File "C:\stable-diffusion-webui-master\venv\lib\site-packages\torch\nn\modules\module.py", line 1604, in load_state_dict
    raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
RuntimeError: Error(s) in loading state_dict for ControlNet:
        Missing key(s) in state_dict: "time_embed.0.weight", "time_embed.0.bias", "time_embed.2.weight", "time_embed.2.bias", "input_blocks.0.0.weight", "input_blocks.0.0.bias", "input_blocks.1.0.in_layers.0.weight", "input_blocks.1.0.in_layers.0.bias", "input_blocks.1.0.in_layers.2.weight", "input_blocks.1.0.in_layers.2.bias", "input_blocks.1.0.emb_layers.1.weight", "input_blocks.1.0.emb_layers.1.bias", "input_blocks.1.0.out_layers.0.weight", "input_blocks.1.0.out_layers.0.bias", "input_blocks.1.0.out_layers.3.weight", "input_blocks.1.0.out_layers.3.bias", "input_blocks.1.1.norm.weight", "input_blocks.1.1.norm.bias", "input_blocks.1.1.proj_in.weight", "input_blocks.1.1.proj_in.bias", "input_blocks.1.1.transformer_blocks.0.attn1.to_q.weight", "input_blocks.1.1.transformer_blocks.0.attn1.to_k.weight", "input_blocks.1.1.transformer_blocks.0.attn1.to_v.weight", "input_blocks.1.1.transformer_blocks.0.attn1.to_out.0.weight", "input_blocks.1.1.transformer_blocks.0.attn1.to_out.0.bias", "input_blocks.1.1.transformer_blocks.0.ff.net.0.proj.weight", "input_blocks.1.1.transformer_blocks.0.ff.net.0.proj.bias", "input_blocks.1.1.transformer_blocks.0.ff.net.2.weight", "input_blocks.1.1.transformer_blocks.0.ff.net.2.bias", "input_blocks.1.1.transformer_blocks.0.attn2.to_q.weight", "input_blocks.1.1.transformer_blocks.0.attn2.to_k.weight", "input_blocks.1.1.transformer_blocks.0.attn2.to_v.weight", "input_blocks.1.1.transformer_blocks.0.attn2.to_out.0.weight", "input_blocks.1.1.transformer_blocks.0.attn2.to_out.0.bias", "input_blocks.1.1.transformer_blocks.0.norm1.weight", "input_blocks.1.1.transformer_blocks.0.norm1.bias", "input_blocks.1.1.transformer_blocks.0.norm2.weight", "input_blocks.1.1.transformer_blocks.0.norm2.bias", "input_blocks.1.1.transformer_blocks.0.norm3.weight", "input_blocks.1.1.transformer_blocks.0.norm3.bias", "input_blocks.1.1.proj_out.weight", "input_blocks.1.1.proj_out.bias", "input_blocks.2.0.in_layers.0.weight", "input_blocks.2.0.in_layers.0.bias", "input_blocks.2.0.in_layers.2.weight", "input_blocks.2.0.in_layers.2.bias", "input_blocks.2.0.emb_layers.1.weight", "input_blocks.2.0.emb_layers.1.bias", "input_blocks.2.0.out_layers.0.weight", "input_blocks.2.0.out_layers.0.bias", "input_blocks.2.0.out_layers.3.weight", "input_blocks.2.0.out_layers.3.bias", "input_blocks.2.1.norm.weight", "input_blocks.2.1.norm.bias", "input_blocks.2.1.proj_in.weight", "input_blocks.2.1.proj_in.bias", "input_blocks.2.1.transformer_blocks.0.attn1.to_q.weight", "input_blocks.2.1.transformer_blocks.0.attn1.to_k.weight", "input_blocks.2.1.transformer_blocks.0.attn1.to_v.weight", "input_blocks.2.1.transformer_blocks.0.attn1.to_out.0.weight", "input_blocks.2.1.transformer_blocks.0.attn1.to_out.0.bias", "input_blocks.2.1.transformer_blocks.0.ff.net.0.proj.weight", "input_blocks.2.1.transformer_blocks.0.ff.net.0.proj.bias", "input_blocks.2.1.transformer_blocks.0.ff.net.2.weight", "input_blocks.2.1.transformer_blocks.0.ff.net.2.bias", "input_blocks.2.1.transformer_blocks.0.attn2.to_q.weight", "input_blocks.2.1.transformer_blocks.0.attn2.to_k.weight", "input_blocks.2.1.transformer_blocks.0.attn2.to_v.weight", "input_blocks.2.1.transformer_blocks.0.attn2.to_out.0.weight", "input_blocks.2.1.transformer_blocks.0.attn2.to_out.0.bias", "input_blocks.2.1.transformer_blocks.0.norm1.weight", "input_blocks.2.1.transformer_blocks.0.norm1.bias", "input_blocks.2.1.transformer_blocks.0.norm2.weight", "input_blocks.2.1.transformer_blocks.0.norm2.bias", "input_blocks.2.1.transformer_blocks.0.norm3.weight", 
"input_blocks.2.1.transformer_blocks.0.norm3.bias", "input_blocks.2.1.proj_out.weight", "input_blocks.2.1.proj_out.bias", "input_blocks.3.0.op.weight", "input_blocks.3.0.op.bias", "input_blocks.4.0.in_layers.0.weight", "input_blocks.4.0.in_layers.0.bias", "input_blocks.4.0.in_layers.2.weight", "input_blocks.4.0.in_layers.2.bias", "input_blocks.4.0.emb_layers.1.weight", "input_blocks.4.0.emb_layers.1.bias", "input_blocks.4.0.out_layers.0.weight", "input_blocks.4.0.out_layers.0.bias", "input_blocks.4.0.out_layers.3.weight", "input_blocks.4.0.out_layers.3.bias", "input_blocks.4.0.skip_connection.weight", "input_blocks.4.0.skip_connection.bias", "input_blocks.4.1.norm.weight", "input_blocks.4.1.norm.bias", "input_blocks.4.1.proj_in.weight", "input_blocks.4.1.proj_in.bias", "input_blocks.4.1.transformer_blocks.0.attn1.to_q.weight", "input_blocks.4.1.transformer_blocks.0.attn1.to_k.weight", "input_blocks.4.1.transformer_blocks.0.attn1.to_v.weight", "input_blocks.4.1.transformer_blocks.0.attn1.to_out.0.weight", "input_blocks.4.1.transformer_blocks.0.attn1.to_out.0.bias", "input_blocks.4.1.transformer_blocks.0.ff.net.0.proj.weight", "input_blocks.4.1.transformer_blocks.0.ff.net.0.proj.bias", "input_blocks.4.1.transformer_blocks.0.ff.net.2.weight", "input_blocks.4.1.transformer_blocks.0.ff.net.2.bias", "input_blocks.4.1.transformer_blocks.0.attn2.to_q.weight", "input_blocks.4.1.transformer_blocks.0.attn2.to_k.weight", "input_blocks.4.1.transformer_blocks.0.attn2.to_v.weight", "input_blocks.4.1.transformer_blocks.0.attn2.to_out.0.weight", "input_blocks.4.1.transformer_blocks.0.attn2.to_out.0.bias", "input_blocks.4.1.transformer_blocks.0.norm1.weight", "input_blocks.4.1.transformer_blocks.0.norm1.bias", "input_blocks.4.1.transformer_blocks.0.norm2.weight", "input_blocks.4.1.transformer_blocks.0.norm2.bias", "input_blocks.4.1.transformer_blocks.0.norm3.weight", "input_blocks.4.1.transformer_blocks.0.norm3.bias", "input_blocks.4.1.proj_out.weight", "input_blocks.4.1.proj_out.bias", "input_blocks.5.0.in_layers.0.weight", "input_blocks.5.0.in_layers.0.bias", "input_blocks.5.0.in_layers.2.weight", "input_blocks.5.0.in_layers.2.bias", "input_blocks.5.0.emb_layers.1.weight", "input_blocks.5.0.emb_layers.1.bias", "input_blocks.5.0.out_layers.0.weight", "input_blocks.5.0.out_layers.0.bias", "input_blocks.5.0.out_layers.3.weight", "input_blocks.5.0.out_layers.3.bias", "input_blocks.5.1.norm.weight", "input_blocks.5.1.norm.bias", "input_blocks.5.1.proj_in.weight", "input_blocks.5.1.proj_in.bias", "input_blocks.5.1.transformer_blocks.0.attn1.to_q.weight", "input_blocks.5.1.transformer_blocks.0.attn1.to_k.weight", "input_blocks.5.1.transformer_blocks.0.attn1.to_v.weight", "input_blocks.5.1.transformer_blocks.0.attn1.to_out.0.weight", "input_blocks.5.1.transformer_blocks.0.attn1.to_out.0.bias", "input_blocks.5.1.transformer_blocks.0.ff.net.0.proj.weight", "input_blocks.5.1.transformer_blocks.0.ff.net.0.proj.bias", "input_blocks.5.1.transformer_blocks.0.ff.net.2.weight", "input_blocks.5.1.transformer_blocks.0.ff.net.2.bias", "input_blocks.5.1.transformer_blocks.0.attn2.to_q.weight", "input_blocks.5.1.transformer_blocks.0.attn2.to_k.weight", "input_blocks.5.1.transformer_blocks.0.attn2.to_v.weight", "input_blocks.5.1.transformer_blocks.0.attn2.to_out.0.weight", "input_blocks.5.1.transformer_blocks.0.attn2.to_out.0.bias", "input_blocks.5.1.transformer_blocks.0.norm1.weight", "input_blocks.5.1.transformer_blocks.0.norm1.bias", "input_blocks.5.1.transformer_blocks.0.norm2.weight", 
"input_blocks.5.1.transformer_blocks.0.norm2.bias", "input_blocks.5.1.transformer_blocks.0.norm3.weight", "input_blocks.5.1.transformer_blocks.0.norm3.bias", "input_blocks.5.1.proj_out.weight", "input_blocks.5.1.proj_out.bias", "input_blocks.6.0.op.weight", "input_blocks.6.0.op.bias", "input_blocks.7.0.in_layers.0.weight", "input_blocks.7.0.in_layers.0.bias", "input_blocks.7.0.in_layers.2.weight", "input_blocks.7.0.in_layers.2.bias", "input_blocks.7.0.emb_layers.1.weight", "input_blocks.7.0.emb_layers.1.bias", "input_blocks.7.0.out_layers.0.weight", "input_blocks.7.0.out_layers.0.bias", "input_blocks.7.0.out_layers.3.weight", "input_blocks.7.0.out_layers.3.bias", "input_blocks.7.0.skip_connection.weight", "input_blocks.7.0.skip_connection.bias", "input_blocks.7.1.norm.weight", "input_blocks.7.1.norm.bias", "input_blocks.7.1.proj_in.weight", "input_blocks.7.1.proj_in.bias", "input_blocks.7.1.transformer_blocks.0.attn1.to_q.weight", "input_blocks.7.1.transformer_blocks.0.attn1.to_k.weight", "input_blocks.7.1.transformer_blocks.0.attn1.to_v.weight", "input_blocks.7.1.transformer_blocks.0.attn1.to_out.0.weight", "input_blocks.7.1.transformer_blocks.0.attn1.to_out.0.bias", "input_blocks.7.1.transformer_blocks.0.ff.net.0.proj.weight", "input_blocks.7.1.transformer_blocks.0.ff.net.0.proj.bias", "input_blocks.7.1.transformer_blocks.0.ff.net.2.weight", "input_blocks.7.1.transformer_blocks.0.ff.net.2.bias", "input_blocks.7.1.transformer_blocks.0.attn2.to_q.weight", "input_blocks.7.1.transformer_blocks.0.attn2.to_k.weight", "input_blocks.7.1.transformer_blocks.0.attn2.to_v.weight", "input_blocks.7.1.transformer_blocks.0.attn2.to_out.0.weight", "input_blocks.7.1.transformer_blocks.0.attn2.to_out.0.bias", "input_blocks.7.1.transformer_blocks.0.norm1.weight", "input_blocks.7.1.transformer_blocks.0.norm1.bias", "input_blocks.7.1.transformer_blocks.0.norm2.weight", "input_blocks.7.1.transformer_blocks.0.norm2.bias", "input_blocks.7.1.transformer_blocks.0.norm3.weight", "input_blocks.7.1.transformer_blocks.0.norm3.bias", "input_blocks.7.1.proj_out.weight", "input_blocks.7.1.proj_out.bias", "input_blocks.8.0.in_layers.0.weight", "input_blocks.8.0.in_layers.0.bias", "input_blocks.8.0.in_layers.2.weight", "input_blocks.8.0.in_layers.2.bias", "input_blocks.8.0.emb_layers.1.weight", "input_blocks.8.0.emb_layers.1.bias", "input_blocks.8.0.out_layers.0.weight", "input_blocks.8.0.out_layers.0.bias", "input_blocks.8.0.out_layers.3.weight", "input_blocks.8.0.out_layers.3.bias", "input_blocks.8.1.norm.weight", "input_blocks.8.1.norm.bias", "input_blocks.8.1.proj_in.weight", "input_blocks.8.1.proj_in.bias", "input_blocks.8.1.transformer_blocks.0.attn1.to_q.weight", "input_blocks.8.1.transformer_blocks.0.attn1.to_k.weight", "input_blocks.8.1.transformer_blocks.0.attn1.to_v.weight", "input_blocks.8.1.transformer_blocks.0.attn1.to_out.0.weight", "input_blocks.8.1.transformer_blocks.0.attn1.to_out.0.bias", "input_blocks.8.1.transformer_blocks.0.ff.net.0.proj.weight", "input_blocks.8.1.transformer_blocks.0.ff.net.0.proj.bias", "input_blocks.8.1.transformer_blocks.0.ff.net.2.weight", "input_blocks.8.1.transformer_blocks.0.ff.net.2.bias", "input_blocks.8.1.transformer_blocks.0.attn2.to_q.weight", "input_blocks.8.1.transformer_blocks.0.attn2.to_k.weight", "input_blocks.8.1.transformer_blocks.0.attn2.to_v.weight", "input_blocks.8.1.transformer_blocks.0.attn2.to_out.0.weight", "input_blocks.8.1.transformer_blocks.0.attn2.to_out.0.bias", "input_blocks.8.1.transformer_blocks.0.norm1.weight", 
"input_blocks.8.1.transformer_blocks.0.norm1.bias", "input_blocks.8.1.transformer_blocks.0.norm2.weight", "input_blocks.8.1.transformer_blocks.0.norm2.bias", "input_blocks.8.1.transformer_blocks.0.norm3.weight", "input_blocks.8.1.transformer_blocks.0.norm3.bias", "input_blocks.8.1.proj_out.weight", "input_blocks.8.1.proj_out.bias", "input_blocks.9.0.op.weight", "input_blocks.9.0.op.bias", "input_blocks.10.0.in_layers.0.weight", "input_blocks.10.0.in_layers.0.bias", "input_blocks.10.0.in_layers.2.weight", "input_blocks.10.0.in_layers.2.bias", "input_blocks.10.0.emb_layers.1.weight", "input_blocks.10.0.emb_layers.1.bias", "input_blocks.10.0.out_layers.0.weight", "input_blocks.10.0.out_layers.0.bias", "input_blocks.10.0.out_layers.3.weight", "input_blocks.10.0.out_layers.3.bias", "input_blocks.11.0.in_layers.0.weight", "input_blocks.11.0.in_layers.0.bias", "input_blocks.11.0.in_layers.2.weight", "input_blocks.11.0.in_layers.2.bias", "input_blocks.11.0.emb_layers.1.weight", "input_blocks.11.0.emb_layers.1.bias", "input_blocks.11.0.out_layers.0.weight", "input_blocks.11.0.out_layers.0.bias", "input_blocks.11.0.out_layers.3.weight", "input_blocks.11.0.out_layers.3.bias", "zero_convs.0.0.weight", "zero_convs.0.0.bias", "zero_convs.1.0.weight", "zero_convs.1.0.bias", "zero_convs.2.0.weight", "zero_convs.2.0.bias", "zero_convs.3.0.weight", "zero_convs.3.0.bias", "zero_convs.4.0.weight", "zero_convs.4.0.bias", "zero_convs.5.0.weight", "zero_convs.5.0.bias", "zero_convs.6.0.weight", "zero_convs.6.0.bias", "zero_convs.7.0.weight", "zero_convs.7.0.bias", "zero_convs.8.0.weight", "zero_convs.8.0.bias", "zero_convs.9.0.weight", "zero_convs.9.0.bias", "zero_convs.10.0.weight", "zero_convs.10.0.bias", "zero_convs.11.0.weight", "zero_convs.11.0.bias", "input_hint_block.0.weight", "input_hint_block.0.bias", "input_hint_block.2.weight", "input_hint_block.2.bias", "input_hint_block.4.weight", "input_hint_block.4.bias", "input_hint_block.6.weight", "input_hint_block.6.bias", "input_hint_block.8.weight", "input_hint_block.8.bias", "input_hint_block.10.weight", "input_hint_block.10.bias", "input_hint_block.12.weight", "input_hint_block.12.bias", "input_hint_block.14.weight", "input_hint_block.14.bias", "middle_block.0.in_layers.0.weight", "middle_block.0.in_layers.0.bias", "middle_block.0.in_layers.2.weight", "middle_block.0.in_layers.2.bias", "middle_block.0.emb_layers.1.weight", "middle_block.0.emb_layers.1.bias", "middle_block.0.out_layers.0.weight", "middle_block.0.out_layers.0.bias", "middle_block.0.out_layers.3.weight", "middle_block.0.out_layers.3.bias", "middle_block.1.norm.weight", "middle_block.1.norm.bias", "middle_block.1.proj_in.weight", "middle_block.1.proj_in.bias", "middle_block.1.transformer_blocks.0.attn1.to_q.weight", "middle_block.1.transformer_blocks.0.attn1.to_k.weight", "middle_block.1.transformer_blocks.0.attn1.to_v.weight", "middle_block.1.transformer_blocks.0.attn1.to_out.0.weight", "middle_block.1.transformer_blocks.0.attn1.to_out.0.bias", "middle_block.1.transformer_blocks.0.ff.net.0.proj.weight", "middle_block.1.transformer_blocks.0.ff.net.0.proj.bias", "middle_block.1.transformer_blocks.0.ff.net.2.weight", "middle_block.1.transformer_blocks.0.ff.net.2.bias", "middle_block.1.transformer_blocks.0.attn2.to_q.weight", "middle_block.1.transformer_blocks.0.attn2.to_k.weight", "middle_block.1.transformer_blocks.0.attn2.to_v.weight", "middle_block.1.transformer_blocks.0.attn2.to_out.0.weight", "middle_block.1.transformer_blocks.0.attn2.to_out.0.bias", 
"middle_block.1.transformer_blocks.0.norm1.weight", "middle_block.1.transformer_blocks.0.norm1.bias", "middle_block.1.transformer_blocks.0.norm2.weight", "middle_block.1.transformer_blocks.0.norm2.bias", "middle_block.1.transformer_blocks.0.norm3.weight", "middle_block.1.transformer_blocks.0.norm3.bias", "middle_block.1.proj_out.weight", "middle_block.1.proj_out.bias", "middle_block.2.in_layers.0.weight", "middle_block.2.in_layers.0.bias", "middle_block.2.in_layers.2.weight", "middle_block.2.in_layers.2.bias", "middle_block.2.emb_layers.1.weight", "middle_block.2.emb_layers.1.bias", "middle_block.2.out_layers.0.weight", "middle_block.2.out_layers.0.bias", "middle_block.2.out_layers.3.weight", "middle_block.2.out_layers.3.bias", "middle_block_out.0.weight", "middle_block_out.0.bias".
        Unexpected key(s) in state_dict: "style_embedding", "proj", "transformer_layes.0.attn.in_proj_weight", "transformer_layes.0.attn.in_proj_bias", "transformer_layes.0.attn.out_proj.weight", "transformer_layes.0.attn.out_proj.bias", "transformer_layes.0.ln_1.weight", "transformer_layes.0.ln_1.bias", "transformer_layes.0.mlp.c_fc.weight", "transformer_layes.0.mlp.c_fc.bias", "transformer_layes.0.mlp.c_proj.weight", "transformer_layes.0.mlp.c_proj.bias", "transformer_layes.0.ln_2.weight", "transformer_layes.0.ln_2.bias", "transformer_layes.1.attn.in_proj_weight", "transformer_layes.1.attn.in_proj_bias", "transformer_layes.1.attn.out_proj.weight", "transformer_layes.1.attn.out_proj.bias", "transformer_layes.1.ln_1.weight", "transformer_layes.1.ln_1.bias", "transformer_layes.1.mlp.c_fc.weight", "transformer_layes.1.mlp.c_fc.bias", "transformer_layes.1.mlp.c_proj.weight", "transformer_layes.1.mlp.c_proj.bias", "transformer_layes.1.ln_2.weight", "transformer_layes.1.ln_2.bias", "transformer_layes.2.attn.in_proj_weight", "transformer_layes.2.attn.in_proj_bias", "transformer_layes.2.attn.out_proj.weight", "transformer_layes.2.attn.out_proj.bias", "transformer_layes.2.ln_1.weight", "transformer_layes.2.ln_1.bias", "transformer_layes.2.mlp.c_fc.weight", "transformer_layes.2.mlp.c_fc.bias", "transformer_layes.2.mlp.c_proj.weight", "transformer_layes.2.mlp.c_proj.bias", "transformer_layes.2.ln_2.weight", "transformer_layes.2.ln_2.bias", "ln_post.weight", "ln_post.bias", "ln_pre.weight", "ln_pre.bias".

100%|██████████████████████████████████████████████████████████████████████████████████| 20/20 [00:05<00:00,  3.75it/s]
Loading CLiP model ViT-L/14 ███████████████████████████████████████████████████████████| 20/20 [00:02<00:00,  9.38it/s]
100%|███████████████████████████████████████| 890M/890M [00:56<00:00, 16.4MiB/s]
Aesthetic scorer error: Model has been downloaded but the SHA256 checksum does not not match
Total progress: 100%|██████████████████████████████████████████████████████████████████| 20/20 [01:06<00:00,  3.34s/it]

And I ran it a second time just to make sure - the log seems pretty much identical.

Loading model: t2iadapter_style_sd14v1 [202e85cc]
Loaded state_dict from [C:\stable-diffusion-webui-master\extensions\sd-webui-controlnet\models\t2iadapter_style_sd14v1.pth]
Error running process: C:\stable-diffusion-webui-master\extensions\sd-webui-controlnet\scripts\controlnet.py
(same traceback and RuntimeError as above, with identical missing and unexpected state_dict keys for ControlNet)

100%|██████████████████████████████████████████████████████████████████████████████████| 20/20 [00:02<00:00,  8.90it/s]
Total progress: 100%|██████████████████████████████████████████████████████████████████| 20/20 [00:02<00:00,  8.07it/s]
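
For reference, the "Unexpected key(s)" in the error (style_embedding, ln_pre, ln_post, transformer_layes.*) look like a small CLIP-style transformer rather than the copy of the UNet that the extension's ControlNet class expects, which is why load_state_dict rejects the file. Below is a minimal diagnostic sketch for inspecting a checkpoint's keys before loading; the path and the key heuristic are assumptions on my part, not the extension's actual detection logic.

import torch

# Assumed local path to the downloaded adapter; adjust to your install.
ckpt_path = r"C:\stable-diffusion-webui-master\extensions\sd-webui-controlnet\models\t2iadapter_style_sd14v1.pth"

# Load only the tensors on CPU; no model is instantiated here.
state_dict = torch.load(ckpt_path, map_location="cpu")
# Some checkpoints nest their weights under a "state_dict" key.
if isinstance(state_dict, dict) and "state_dict" in state_dict:
    state_dict = state_dict["state_dict"]

keys = list(state_dict.keys())
print(f"{len(keys)} tensors; first keys: {keys[:5]}")

# Rough heuristic: style adapters carry CLIP-like transformer weights,
# while ControlNet checkpoints carry UNet-style blocks.
if any(k.startswith(("style_embedding", "ln_pre", "ln_post", "transformer_layes")) for k in keys):
    print("Looks like a T2I style adapter, not a ControlNet model.")
elif any(k.startswith(("input_blocks", "zero_convs", "input_hint_block")) for k in keys):
    print("Looks like a ControlNet-style state_dict.")
else:
    print("Unrecognized checkpoint layout.")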
AugmentedRealityCat commented 1 year ago

There is now a live demo app of the T2i Style model on Huggingface - they just restarted the whole thing to launch the new version. https://huggingface.co/spaces/Adapter/T2I-Adapter

specblades commented 1 year ago

@Mikubill It seems the StyleAdapter needs more than the "2" maximum weight. It completely blends in with big prompts and does nothing.

UPD: Seems like the weight has no effect on the StyleAdapter.

FurkanGozukara commented 1 year ago

@AugmentedRealityCat any ideas?

https://github.com/Mikubill/sd-webui-controlnet/issues/512

FurkanGozukara commented 1 year ago

@specblades any ideas?

https://github.com/Mikubill/sd-webui-controlnet/issues/512

joket1999 commented 1 year ago

The same error for me.

Hecatoncheir commented 1 year ago

This error can occur if t2iadapter_style_sd14v1.yaml is not in stable-diffusion-webui\extensions\sd-webui-controlnet\models.
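
As a quick sanity check, something like the following lists any .pth in the extension's models folder that has no matching .yaml next to it (the folder path is an assumption; adjust it to your own install):

from pathlib import Path

# Assumed ControlNet extension models folder; adjust to your install.
models_dir = Path(r"C:\stable-diffusion-webui-master\extensions\sd-webui-controlnet\models")

for pth in sorted(models_dir.glob("*.pth")):
    yaml_cfg = pth.with_suffix(".yaml")
    status = "ok" if yaml_cfg.exists() else "MISSING matching .yaml config"
    print(f"{pth.name}: {status}")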

FurkanGozukara commented 1 year ago

The errors are fixed; here is a tutorial:

https://youtu.be/tXaQAkOgezQ

erkerw commented 1 year ago

Check your "yml" files in the "models" folder; I noticed the downloaded model yml files are named incorrectly. Some files do not end with ".yml"; this is an official mistake.
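
A small sketch along the same lines to spot config files whose extension got truncated or dropped (same assumed folder as above; the rename line is commented out so nothing is changed automatically):

from pathlib import Path

# Assumed ControlNet extension models folder; adjust to your install.
models_dir = Path(r"C:\stable-diffusion-webui-master\extensions\sd-webui-controlnet\models")

expected = {".pth", ".safetensors", ".yaml", ".yml"}
for f in sorted(models_dir.iterdir()):
    if f.is_file() and f.suffix.lower() not in expected:
        suggestion = f.with_suffix(".yaml")
        print(f"Suspicious file name: {f.name} (maybe rename to {suggestion.name})")
        # f.rename(suggestion)  # uncomment only after checking the name by hand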

joket1999 commented 1 year ago

The same error for me.

It works after executing git pull in stable-diffusion-webui\extensions\sd-webui-controlnet.
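
If you prefer to script the update rather than run it by hand, a rough equivalent is sketched below (assumptions: git is on PATH and the extension lives at the path shown; adjust to your install):

import subprocess

# Assumed extension directory; adjust to your install.
ext_dir = r"C:\stable-diffusion-webui-master\extensions\sd-webui-controlnet"

# "git -C <dir> pull" runs the pull inside that directory.
subprocess.run(["git", "-C", ext_dir, "pull"], check=True)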