lllyasviel / stable-diffusion-webui-forge


[Bug]: SD difference (script) sd-webui-IS-NET-pro cannot input images to controlnet #439

Open k52252467 opened 7 months ago

k52252467 commented 7 months ago

Checklist

What happened?

In stable-diffusion-webui I can use the sd-webui-IS-NET-pro extension (script) normally. I use its Txt_to_Image_multi_frame_rendering.py feature in txt2img, but it cannot be used in stable-diffusion-webui-forge: it does not seem to be able to pass an image to ControlNet through "Use another image as ControlNet input".

This is the URL of the extension: https://github.com/ClockZinc/sd-webui-IS-NET-pro

I know this may not be related to your code, but I also can't use the original ControlNet extension in stable-diffusion-webui-forge; it gets disabled after installation.

Does your built-in ControlNet not support certain scripts or plug-ins?

You can see in the log that it behaves as if no image had been provided, even though sd-webui-IS-NET-pro is supposed to supply one through "Use another image as ControlNet input".

Screenshot 2024-02-28 223052

In the screenshot I have "Use another image as ControlNet input" checked, and with it checked everything still works normally in the original stable-diffusion-webui. I tried checking it in stable-diffusion-webui-forge too, but it still does not work there.

Simply put: it works normally in stable-diffusion-webui, but not in stable-diffusion-webui-forge.

Steps to reproduce the problem

Use the sd-webui-IS-NET-pro extension (script) in txt2img and run Txt_to_Image_multi_frame_rendering.py.

It cannot pass an image into ControlNet.

What should have happened?

The script should be able to pass its images to ControlNet through "Use another image as ControlNet input". I need to convert hundreds to thousands of pictures at a time, and it should convert each one according to its frame number.
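
For context, here is a minimal sketch of how scripts like this usually hand a per-frame image to ControlNet in the original webui, via the Mikubill sd-webui-controlnet external_code API. It is only an illustration: set_controlnet_frame and frame_path are made-up names for this example, the import path is one common pattern rather than a guaranteed one, and this is not IS-NET-pro's actual code.

```python
# Illustrative sketch only; not IS-NET-pro's actual code.
import importlib
import cv2

# One common way extensions locate the Mikubill ControlNet API from the webui
# root; the exact import path is an assumption and varies between installs.
external_code = importlib.import_module(
    "extensions.sd-webui-controlnet.scripts.external_code")

def set_controlnet_frame(p, frame_path):
    """Point the first ControlNet unit at the current frame before generation."""
    units = external_code.get_all_units_in_processing(p)
    if not units:
        return  # ControlNet script not attached to this processing object
    frame = cv2.cvtColor(cv2.imread(frame_path), cv2.COLOR_BGR2RGB)
    units[0].image = frame  # the "Use another image as ControlNet input" slot
    external_code.update_cn_script_in_processing(p, units)
```

In Forge the ControlNet integration lives under extensions-builtin/sd_forge_controlnet (lib_controlnet) rather than extensions/sd-webui-controlnet, so entry points like these may not exist or may behave differently, which would match what I'm seeing.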

What browsers do you use to access the UI ?

No response

Sysinfo

sysinfo-2024-02-28-14-18.json

Console logs

Python 3.10.6 (tags/v3.10.6:9c7b4bd, Aug  1 2022, 21:53:49) [MSC v.1932 64 bit (AMD64)]
Version: f0.0.16v1.8.0rc-latest-268-gb59deaa3
Commit hash: b59deaa382bf5c968419eff4559f7d06fc0e76e7
loading WD14-tagger reqs from C:\webui_forge_cu121_torch21\webui\extensions\stable-diffusion-webui-wd14-tagger\requirements.txt
Checking WD14-tagger requirements.
Launching Web UI with arguments: --xformers --ckpt-dir C:/stable-diffusion-webui/models/Stable-diffusion --hypernetwork-dir C:/stable-diffusion-webui/models/hypernetworks --embeddings-dir C:/stable-diffusion-webui/embeddings --lora-dir C:/stable-diffusion-webui/models/Lora --vae-dir C:/stable-diffusion-webui/models/VAE
Total VRAM 24564 MB, total RAM 65277 MB
WARNING:xformers:A matching Triton is not available, some optimizations will not be enabled.
Error caught was: No module named 'triton'
xformers version: 0.0.23.post1
Set vram state to: NORMAL_VRAM
Device: cuda:0 NVIDIA GeForce RTX 4090 : native
Hint: your device supports --pin-shared-memory for potential speed improvements.
Hint: your device supports --cuda-malloc for potential speed improvements.
Hint: your device supports --cuda-stream for potential speed improvements.
VAE dtype: torch.bfloat16
CUDA Stream Activated:  False
2024-02-28 22:23:14.774863: I tensorflow/core/util/port.cc:113] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
WARNING:tensorflow:From C:\webui_forge_cu121_torch21\system\python\lib\site-packages\keras\src\losses.py:2976: The name tf.losses.sparse_softmax_cross_entropy is deprecated. Please use tf.compat.v1.losses.sparse_softmax_cross_entropy instead.

Using xformers cross attention
ControlNet preprocessor location: C:\webui_forge_cu121_torch21\webui\models\ControlNetPreprocessor
*** Error loading script: controlnet.py
    Traceback (most recent call last):
      File "C:\webui_forge_cu121_torch21\webui\modules\scripts.py", line 544, in load_scripts
        script_module = script_loading.load_module(scriptfile.path)
      File "C:\webui_forge_cu121_torch21\webui\modules\script_loading.py", line 10, in load_module
        module_spec.loader.exec_module(module)
      File "<frozen importlib._bootstrap_external>", line 883, in exec_module
      File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
      File "C:\webui_forge_cu121_torch21\webui\extensions-builtin\sd_forge_controlnet\scripts\controlnet.py", line 15, in <module>
        from lib_controlnet.utils import (
    ImportError: cannot import name 'try_unfold_unit' from 'lib_controlnet.utils' (C:\webui_forge_cu121_torch21\webui\extensions-builtin\sd_forge_controlnet\lib_controlnet\utils.py)

---
Civitai Helper: Get Custom Model Folder
[-] ADetailer initialized. version: 24.1.2, num models: 14
sd-webui-prompt-all-in-one background API service started successfully.
[ControlNet-Travel] extension Mikubill/sd-webui-controlnet not found, ControlNet-Travel ignored :(
*** Error loading script: controlnet_travel.py
    Traceback (most recent call last):
      File "C:\webui_forge_cu121_torch21\webui\modules\scripts.py", line 544, in load_scripts
        script_module = script_loading.load_module(scriptfile.path)
      File "C:\webui_forge_cu121_torch21\webui\modules\script_loading.py", line 10, in load_module
        module_spec.loader.exec_module(module)
      File "<frozen importlib._bootstrap_external>", line 883, in exec_module
      File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
      File "C:\webui_forge_cu121_torch21\webui\extensions\stable-diffusion-webui-prompt-travel\scripts\controlnet_travel.py", line 99, in <module>
        def hook_hijack(self:UnetHook, model:UNetModel, sd_ldm:LatentDiffusion, control_params:List[ControlParams], process:Processing, batch_option_uint_separate=False, batch_option_style_align=False):
    NameError: name 'UnetHook' is not defined

---
== WD14 tagger /gpu:0, uname_result(system='Windows', node='DESKTOP-5KD9NI7', release='10', version='10.0.22631', machine='AMD64') ==
2024-02-28 22:23:21,949 - AnimateDiff - INFO - Injecting LCM to UI.
Loading weights [b307772c81] from C:\stable-diffusion-webui\models\Stable-diffusion\MAIN\hanimix_real_v20c.safetensors
model_type EPS
UNet ADM Dimension 0
2024-02-28 22:23:22,339 - AnimateDiff - INFO - Hacking i2i-batch.
2024-02-28 22:23:22,390 - ControlNet - INFO - ControlNet UI callback registered.
Civitai Helper: Settings:
Civitai Helper: max_size_preview: True
Civitai Helper: skip_nsfw_preview: False
Civitai Helper: open_url_with_js: True
Civitai Helper: proxy:
Civitai Helper: use civitai api key: False
*** Error executing callback ui_tabs_callback for C:\webui_forge_cu121_torch21\webui\extensions\sd-webui-deforum\scripts\deforum.py
    Traceback (most recent call last):
      File "C:\webui_forge_cu121_torch21\webui\modules\script_callbacks.py", line 183, in ui_tabs_callback
        res += c.callback() or []
      File "C:\webui_forge_cu121_torch21\webui\extensions\sd-webui-deforum\scripts\deforum_helpers\ui_right.py", line 92, in on_ui_tabs
        deforum_gallery, generation_info, html_info, _ = create_output_panel("deforum", opts.outdir_img2img_samples)
    TypeError: cannot unpack non-iterable OutputPanel object

---
Running on local URL:  http://127.0.0.1:7862
Using xformers attention in VAE
Working with z of shape (1, 4, 32, 32) = 4096 dimensions.
Using xformers attention in VAE
extra {'cond_stage_model.clip_l.text_projection', 'cond_stage_model.clip_l.logit_scale'}
left over keys: dict_keys(['alphas_cumprod', 'alphas_cumprod_prev', 'betas', 'log_one_minus_alphas_cumprod', 'posterior_log_variance_clipped', 'posterior_mean_coef1', 'posterior_mean_coef2', 'posterior_variance', 'sqrt_alphas_cumprod', 'sqrt_one_minus_alphas_cumprod', 'sqrt_recip_alphas_cumprod', 'sqrt_recipm1_alphas_cumprod'])

To create a public link, set `share=True` in `launch()`.
Loading VAE weights specified in settings: C:\stable-diffusion-webui\models\VAE\vae-ft-mse-840000-ema-pruned.safetensors
Startup time: 17.4s (prepare environment: 3.6s, import torch: 2.2s, import gradio: 0.6s, setup paths: 3.4s, other imports: 0.3s, load scripts: 3.5s, create ui: 0.9s, gradio launch: 2.4s, app_started_callback: 0.3s).
To load target model SD1ClipModel
Begin to load 1 model
[Memory Management] Current Free GPU Memory (MB) =  22991.81640625
[Memory Management] Model Memory (MB) =  454.2076225280762
[Memory Management] Minimal Inference Memory (MB) =  1024.0
[Memory Management] Estimated Remaining GPU Memory (MB) =  21513.608783721924
Moving model(s) has taken 0.05 seconds
Model loaded in 4.0s (load weights from disk: 0.2s, forge load real models: 2.6s, load VAE: 0.4s, load textual inversion embeddings: 0.3s, calculate empty prompt: 0.4s).
ISnet::MFR::will process    3 images
ISNET::MFR::Single mode OPEN!!!
*** Error running process: C:\webui_forge_cu121_torch21\webui\extensions-builtin\sd_forge_controlnet\scripts\bk_controlnet.py
    Traceback (most recent call last):
      File "C:\webui_forge_cu121_torch21\webui\modules\scripts.py", line 803, in process
        script.process(p, *script_args)
      File "C:\webui_forge_cu121_torch21\system\python\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
        return func(*args, **kwargs)
      File "C:\webui_forge_cu121_torch21\webui\extensions-builtin\sd_forge_controlnet\scripts\bk_controlnet.py", line 546, in process
        self.bound_check_params(unit)
      File "C:\webui_forge_cu121_torch21\webui\extensions-builtin\sd_forge_controlnet\scripts\bk_controlnet.py", line 518, in bound_check_params
        if unit.processor_res < 0:
    TypeError: '<' not supported between instances of 'str' and 'int'

---
*** Error running process_before_every_sampling: C:\webui_forge_cu121_torch21\webui\extensions-builtin\sd_forge_controlnet\scripts\bk_controlnet.py
    Traceback (most recent call last):
      File "C:\webui_forge_cu121_torch21\webui\modules\scripts.py", line 835, in process_before_every_sampling
        script.process_before_every_sampling(p, *script_args, **kwargs)
      File "C:\webui_forge_cu121_torch21\system\python\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
        return func(*args, **kwargs)
      File "C:\webui_forge_cu121_torch21\webui\extensions-builtin\sd_forge_controlnet\scripts\bk_controlnet.py", line 555, in process_before_every_sampling
        self.process_unit_before_every_sampling(p, unit, self.current_params[i], *args, **kwargs)
    KeyError: 0

---
To load target model BaseModel
Begin to load 1 model
[Memory Management] Current Free GPU Memory (MB) =  22594.93603515625
[Memory Management] Model Memory (MB) =  1639.4137649536133
[Memory Management] Minimal Inference Memory (MB) =  1024.0
[Memory Management] Estimated Remaining GPU Memory (MB) =  19931.522270202637
Moving model(s) has taken 0.32 seconds
100%|██████████████████████████████████████████████████████████████████████████████████| 20/20 [00:02<00:00,  7.50it/s]
To load target model AutoencoderKL█████████                                            | 20/60 [00:02<00:04,  8.07it/s]
Begin to load 1 model
[Memory Management] Current Free GPU Memory (MB) =  20913.4794921875
[Memory Management] Model Memory (MB) =  159.55708122253418
[Memory Management] Minimal Inference Memory (MB) =  1024.0
[Memory Management] Estimated Remaining GPU Memory (MB) =  19729.922410964966
Moving model(s) has taken 0.08 seconds
*** Error running postprocess_batch_list: C:\webui_forge_cu121_torch21\webui\extensions-builtin\sd_forge_controlnet\scripts\bk_controlnet.py
    Traceback (most recent call last):
      File "C:\webui_forge_cu121_torch21\webui\modules\scripts.py", line 859, in postprocess_batch_list
        script.postprocess_batch_list(p, pp, *script_args, **kwargs)
      File "C:\webui_forge_cu121_torch21\system\python\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
        return func(*args, **kwargs)
      File "C:\webui_forge_cu121_torch21\webui\extensions-builtin\sd_forge_controlnet\scripts\bk_controlnet.py", line 561, in postprocess_batch_list
        self.process_unit_after_every_sampling(p, unit, self.current_params[i], pp, *args, **kwargs)
    KeyError: 0

---
*** Error running process: C:\webui_forge_cu121_torch21\webui\extensions-builtin\sd_forge_controlnet\scripts\bk_controlnet.py
    Traceback (most recent call last):
      File "C:\webui_forge_cu121_torch21\webui\modules\scripts.py", line 803, in process
        script.process(p, *script_args)
      File "C:\webui_forge_cu121_torch21\system\python\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
        return func(*args, **kwargs)
      File "C:\webui_forge_cu121_torch21\webui\extensions-builtin\sd_forge_controlnet\scripts\bk_controlnet.py", line 546, in process
        self.bound_check_params(unit)
      File "C:\webui_forge_cu121_torch21\webui\extensions-builtin\sd_forge_controlnet\scripts\bk_controlnet.py", line 518, in bound_check_params
        if unit.processor_res < 0:
    TypeError: '<' not supported between instances of 'str' and 'int'

---
*** Error running process_before_every_sampling: C:\webui_forge_cu121_torch21\webui\extensions-builtin\sd_forge_controlnet\scripts\bk_controlnet.py
    Traceback (most recent call last):
      File "C:\webui_forge_cu121_torch21\webui\modules\scripts.py", line 835, in process_before_every_sampling
        script.process_before_every_sampling(p, *script_args, **kwargs)
      File "C:\webui_forge_cu121_torch21\system\python\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
        return func(*args, **kwargs)
      File "C:\webui_forge_cu121_torch21\webui\extensions-builtin\sd_forge_controlnet\scripts\bk_controlnet.py", line 555, in process_before_every_sampling
        self.process_unit_before_every_sampling(p, unit, self.current_params[i], *args, **kwargs)
    KeyError: 0

---
100%|██████████████████████████████████████████████████████████████████████████████████| 20/20 [00:02<00:00,  8.17it/s]
*** Error running postprocess_batch_list: C:\webui_forge_cu121_torch21\webui\extensions-builtin\sd_forge_controlnet\scripts\bk_controlnet.py
    Traceback (most recent call last):
      File "C:\webui_forge_cu121_torch21\webui\modules\scripts.py", line 859, in postprocess_batch_list
        script.postprocess_batch_list(p, pp, *script_args, **kwargs)
      File "C:\webui_forge_cu121_torch21\system\python\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
        return func(*args, **kwargs)
      File "C:\webui_forge_cu121_torch21\webui\extensions-builtin\sd_forge_controlnet\scripts\bk_controlnet.py", line 561, in postprocess_batch_list
        self.process_unit_after_every_sampling(p, unit, self.current_params[i], pp, *args, **kwargs)
    KeyError: 0

---
*** Error running process: C:\webui_forge_cu121_torch21\webui\extensions-builtin\sd_forge_controlnet\scripts\bk_controlnet.py
    Traceback (most recent call last):
      File "C:\webui_forge_cu121_torch21\webui\modules\scripts.py", line 803, in process
        script.process(p, *script_args)
      File "C:\webui_forge_cu121_torch21\system\python\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
        return func(*args, **kwargs)
      File "C:\webui_forge_cu121_torch21\webui\extensions-builtin\sd_forge_controlnet\scripts\bk_controlnet.py", line 546, in process
        self.bound_check_params(unit)
      File "C:\webui_forge_cu121_torch21\webui\extensions-builtin\sd_forge_controlnet\scripts\bk_controlnet.py", line 518, in bound_check_params
        if unit.processor_res < 0:
    TypeError: '<' not supported between instances of 'str' and 'int'

---
*** Error running process_before_every_sampling: C:\webui_forge_cu121_torch21\webui\extensions-builtin\sd_forge_controlnet\scripts\bk_controlnet.py
    Traceback (most recent call last):
      File "C:\webui_forge_cu121_torch21\webui\modules\scripts.py", line 835, in process_before_every_sampling
        script.process_before_every_sampling(p, *script_args, **kwargs)
      File "C:\webui_forge_cu121_torch21\system\python\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
        return func(*args, **kwargs)
      File "C:\webui_forge_cu121_torch21\webui\extensions-builtin\sd_forge_controlnet\scripts\bk_controlnet.py", line 555, in process_before_every_sampling
        self.process_unit_before_every_sampling(p, unit, self.current_params[i], *args, **kwargs)
    KeyError: 0

---
100%|██████████████████████████████████████████████████████████████████████████████████| 20/20 [00:02<00:00,  8.20it/s]
*** Error running postprocess_batch_list: C:\webui_forge_cu121_torch21\webui\extensions-builtin\sd_forge_controlnet\scripts\bk_controlnet.py
    Traceback (most recent call last):
      File "C:\webui_forge_cu121_torch21\webui\modules\scripts.py", line 859, in postprocess_batch_list
        script.postprocess_batch_list(p, pp, *script_args, **kwargs)
      File "C:\webui_forge_cu121_torch21\system\python\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
        return func(*args, **kwargs)
      File "C:\webui_forge_cu121_torch21\webui\extensions-builtin\sd_forge_controlnet\scripts\bk_controlnet.py", line 561, in postprocess_batch_list
        self.process_unit_after_every_sampling(p, unit, self.current_params[i], pp, *args, **kwargs)
    KeyError: 0

---

-------------------
 ISNET::MFR is DONE!
-------------------
Total progress: 100%|██████████████████████████████████████████████████████████████████| 60/60 [00:10<00:00,  5.46it/s]
Total progress: 100%|██████████████████████████████████████████████████████████████████| 60/60 [00:10<00:00,  8.17it/s]

Additional information

No response

k52252467 commented 7 months ago

Screenshot 2024-02-28 234229

Attached is a comparison: ControlNet in stable-diffusion-webui has a Batch Options section, but the built-in ControlNet in your stable-diffusion-webui-forge does not. Could this be the reason for the error?

abline11 commented 7 months ago

This is an annoying, reproducible bug in Forge.

The lines of code ending with:

File "C:\AI\stable-diffusion-webui-forge\webui\extensions-builtin\sd_forge_controlnet\scripts\controlnet.py", line 555, in process_before_every_sampling self.process_unit_before_every_sampling(p, unit, self.current_params[i], *args, **kwargs)

... are indicating that, before every sampling, it checks the contents of all the ControlNet units (by default ControlNet 0, 1 and 2).

If you are using just one unit with InsightFace+CLIP-H (IPAdapter) / ip-adapter-faceid-plusv2_sdxl [187cb962], you will also see that whatever is in ControlNet 0 has been copied down into ControlNets 1 and 2. If you disable ControlNets 1 and 2 and set their control type back to All, so that the preprocessor/model combination returns to none/none, the error message goes away.

That is how to get rid of the message. Unfortunately, if you go into Settings and try to save this as the default, it does not stick: the next time you restart Forge from scratch, ControlNets 1 and 2 are back to whatever is in ControlNet 0 and the error message reappears.
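
For what it's worth, a minimal defensive sketch against the two failures quoted in the log above could look like the following. It is only an illustration, not a patch from the Forge codebase: apart from the names visible in the traceback (unit.processor_res, self.current_params, process_unit_before_every_sampling), everything here is assumed.

```python
# Hypothetical guards sketched against the bk_controlnet.py lines in the log;
# the wrapper class and attributes other than those in the traceback are
# assumed for illustration only.

class ControlNetScriptSketch:
    def __init__(self):
        self.current_params = {}   # filled by process() when a unit is prepared
        self.enabled_units = []    # assumed list of ControlNet units

    def bound_check_params(self, unit):
        # processor_res can arrive as a string when the unit is rebuilt from raw
        # script args; coerce it before the comparison that currently raises
        # TypeError: '<' not supported between instances of 'str' and 'int'.
        try:
            unit.processor_res = int(float(unit.processor_res))
        except (TypeError, ValueError):
            unit.processor_res = 512   # assumed fallback resolution
        if unit.processor_res < 0:
            unit.processor_res = 512

    def process_before_every_sampling(self, p, *script_args, **kwargs):
        for i, unit in enumerate(self.enabled_units):
            if i not in self.current_params:
                continue   # unit was never prepared; avoids the KeyError: 0
            self.process_unit_before_every_sampling(
                p, unit, self.current_params[i], *script_args, **kwargs)

    def process_unit_before_every_sampling(self, p, unit, params, *args, **kwargs):
        pass   # placeholder; the real work happens in Forge's implementation
```

This would only suppress the symptoms; the underlying issue of units 1 and 2 inheriting unit 0's settings would still need fixing in the defaults/UI.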

It's a shame that making ControlNet a built-in extension is causing these issues that don't appear in Automatic1111.

Personally, I still prefer Forge for the speed/memory handling, but I can see why people are put off by these unnecessary bugs.

But, hey ho, it is free.

k52252467 commented 7 months ago

> This is an annoying, reproducible bug in Forge. [...]

Thank you very much for the insights and explanations. I hope the author can fix this problem soon.