Shangooriginal closed this issue 10 months ago.
Can you provide more information on how to reproduce the issue? Running reference_only seems fine in my local environment.
I also encountered a similar error. It seems to occur when using --xformers in version 1.7. The error does not seem to occur if the venv is deleted and --xformers is not used. I have not verified this thoroughly, and it is not exactly the same error, so this may not be the cause, but I hope it helps.
Which xformers version do you use, @garugann? With the --xformers flag in 1.7.0 A1111, it seems to work fine for me.
version: v1.7.0 • python: 3.10.11 • torch: 2.0.1+cu118 • xformers: 0.0.20 • gradio: 3.41.2 • checkpoint: 735df1f05d
It seems that an error occurs in this environment. To be honest, I don't know if this is the real cause. In the meantime, I can generate with Reference successfully when --xformers is not used.
I have the following env and reference is working fine: version: v1.7.0-RC-5-gf92d6149 • python: 3.10.6 • torch: 2.0.1+cu118 • xformers: 0.0.20 • gradio: 3.41.2 • checkpoint: 79e42fb744
While I was comparing it to a completely new environment, I realized that the config.json file might be outdated. After deleting it and resetting the config, it seems to be operating without any problems even in the same environment. version: v1.7.0 • python: 3.10.11 • torch: 2.0.1+cu118 • xformers: 0.0.20 • gradio: 3.41.2 •
I think it might be Hypertile. If I turn it on or off in the settings, the error appears or disappears.
Traceback (most recent call last):
File "D:\gongxiang\stable-diffusion-webui\modules\call_queue.py", line 57, in f
res = list(func(*args, **kwargs))
^^^^^^^^^^^^^^^^^^^^^
File "D:\gongxiang\stable-diffusion-webui\modules\call_queue.py", line 36, in f
res = func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "D:\gongxiang\stable-diffusion-webui\modules\txt2img.py", line 55, in txt2img
processed = processing.process_images(p)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\gongxiang\stable-diffusion-webui\modules\processing.py", line 734, in process_images
res = process_images_inner(p)
^^^^^^^^^^^^^^^^^^^^^^^
File "D:\gongxiang\stable-diffusion-webui\extensions\sd-webui-controlnet\scripts\batch_hijack.py", line 41, in processing_process_images_hijack
return getattr(processing, '__controlnet_original_process_images_inner')(p, *args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\gongxiang\stable-diffusion-webui\modules\processing.py", line 868, in process_images_inner
samples_ddim = p.sample(conditioning=p.c, unconditional_conditioning=p.uc, seeds=p.seeds, subseeds=p.subseeds, subseed_strength=p.subseed_strength, prompts=p.prompts)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\gongxiang\stable-diffusion-webui\extensions\sd-webui-controlnet\scripts\hook.py", line 435, in process_sample
return process.sample_before_CN_hack(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\gongxiang\stable-diffusion-webui\modules\processing.py", line 1142, in sample
samples = self.sampler.sample(self, x, conditioning, unconditional_conditioning, image_conditioning=self.txt2img_image_conditioning(x))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\gongxiang\stable-diffusion-webui\modules\sd_samplers_kdiffusion.py", line 235, in sample
samples = self.launch_sampling(steps, lambda: self.func(self.model_wrap_cfg, x, extra_args=self.sampler_extra_args, disable=False, callback=self.callback_state, **extra_params_kwargs))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\gongxiang\stable-diffusion-webui\modules\sd_samplers_common.py", line 261, in launch_sampling
return func()
^^^^^^
File "D:\gongxiang\stable-diffusion-webui\modules\sd_samplers_kdiffusion.py", line 235, in expand
yourself before calling memory_efficient_attention if you need to
The same error comes from the Hypertile call. At the moment I have no intention of tracking down whether the problem is Hypertile or Python 3.11. The alternative would be to roll the environment back to xformers-17, which I don't want to do. For now I'm turning Hypertile off to avoid this error. Maybe I should file a bug with webui?
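To illustrate why a Hypertile-style optimization would interact with an attention hook at all (this is only a sketch with made-up shapes, not the actual Hypertile or sd-webui-controlnet code): tiling the latent before self-attention shrinks the token count each attention call sees, so anything computed or cached against the un-tiled token count no longer lines up.

```python
import torch

# Sketch only: shapes and tiling below are assumptions for illustration,
# not taken from the Hypertile extension or sd-webui-controlnet.
b, c, h, w = 2, 320, 64, 64
latent = torch.randn(b, c, h, w)

# Un-tiled pass: one attention call over all 64*64 = 4096 tokens.
full_tokens = latent.flatten(2).transpose(1, 2)             # (2, 4096, 320)

# Hypertile-style pass: split H and W into 2x2 tiles and fold the tiles
# into the batch, so each attention call sees only 1024 tokens.
tile = 2
tiles = latent.reshape(b, c, tile, h // tile, tile, w // tile)
tiles = tiles.permute(0, 2, 4, 3, 5, 1)                      # (b, 2, 2, 32, 32, c)
tiled_tokens = tiles.reshape(b * tile * tile, (h // tile) * (w // tile), c)

print(full_tokens.shape)    # torch.Size([2, 4096, 320])
print(tiled_tokens.shape)   # torch.Size([8, 1024, 320])
```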
Environment os: Windows 10 browser: Firefox gpu: NVIDIA RTX 4070 Ti cpu: Intel cuda: 11.8 cudnn: 8.9.7
Webui version: v1.7.0 python: 3.10.13 torch: 2.0.1+cu118 xformers: 0.0.20 gradio: 3.41.2 checkpoint: 1449e5b0b9 arguments: --xformers --xformers-flash-attention --no-half-vae
Controlnet version: v1.1.440 checkpoint: 9a5f2883 preprocessor: canny model: controlnetxlCNXL_bdsqlszCanny [a74daa41]
In addition to canny, I get the error with other preprocessors as well.
Model: AnimagineXL_V3 (based on SDXL 1.0)
VAE: sdxl-vae
I'm experiencing something similar, albeit a different error.
I'm using Tiled VAE with Hypertile and DeepCache.
I don't use Lora or Embedding.
In my case, turning off xformers made no difference.
When using the SDXL model, if I enable Hypertile or DeepCache and use ControlNet, I get an error.
The error is as follows
RuntimeError: Sizes of tensors must match except in dimension 2. Expected size 4096 but got size 2048 for tensor number 1 in the list.
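That message is what PyTorch raises when torch.cat is given tensors whose non-concatenated dimensions disagree. A minimal reproduction (the shapes are assumptions chosen only to match the numbers in the message, not values taken from the extensions):

```python
import torch

# Sketch only: two feature maps concatenated along the channel dimension,
# where one pass produced half as many tokens (e.g. because of tiling/caching).
full   = torch.randn(2, 4096, 320)   # 4096 tokens at this layer
halved = torch.randn(2, 2048, 320)   # same layer, but only 2048 tokens this pass

try:
    torch.cat([full, halved], dim=2)
except RuntimeError as e:
    print(e)
# Sizes of tensors must match except in dimension 2.
# Expected size 4096 but got size 2048 for tensor number 1 in the list.
```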
Is there an existing issue for this?
What happened?
All of the Reference preprocessors generate errors when I try to use them, with both SD1.5 and SDXL models.
Steps to reproduce the problem
I upload a picture for reference, adjust the settings, and click Generate; an error message appears. See below.
RuntimeError: shape '[81920, 8, 40]' is invalid for input of size 3276800
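(For context, that message is what PyTorch raises when a .view() target shape was computed from a different token count than the tensor actually has. The numbers below are chosen only to reproduce the same message; they are not taken from the webui or ControlNet code.)

```python
import torch

heads, dim_head = 8, 40      # 8 * 40 = 320 channels
assumed_tokens = 16384       # token count the reshape was computed for
actual_tokens = 2048         # token count actually received
batch = 5

x = torch.randn(batch * actual_tokens, heads * dim_head)   # 3,276,800 elements

try:
    x.view(batch * assumed_tokens, heads, dim_head)
except RuntimeError as e:
    print(e)
# shape '[81920, 8, 40]' is invalid for input of size 3276800
```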
What should have happened?
It should just work and generate a new image output using the reference.
Commit where the problem happens
webui: version: [v1.7.0-RC-4-g120a84bd] controlnet: [a13bd2fe]
What browsers do you use to access the UI ?
Google Chrome
Command Line Arguments
List of enabled extensions
Not applicable
Console logs
Additional information
No response