kijai / ComfyUI-SUPIR

SUPIR upscaling wrapper for ComfyUI
1.59k stars · 88 forks

xFormers wasn't build with CUDA support #122

Open · mr-bob-chang opened 6 months ago

mr-bob-chang commented 6 months ago

I have tried uninstalling and reinstalling xFormers but this problem still occurs. The complete error message is as follows:

Error occurred when executing SUPIR_sample:

No operator found for memory_efficient_attention_forward with inputs:
    query     : shape=(40, 1024, 1, 64) (torch.float16)
    key       : shape=(40, 1024, 1, 64) (torch.float16)
    value     : shape=(40, 1024, 1, 64) (torch.float16)
    attn_bias : <class 'NoneType'>
    p         : 0.0
decoderF is not supported because:
    xFormers wasn't build with CUDA support
    attn_bias type is <class 'NoneType'>
    operator wasn't built - see python -m xformers.info for more info
flshattF@0.0.0 is not supported because:
    xFormers wasn't build with CUDA support
    operator wasn't built - see python -m xformers.info for more info
cutlassF is not supported because:
    xFormers wasn't build with CUDA support
    operator wasn't built - see python -m xformers.info for more info
smallkF is not supported because:
    max(query.shape[-1] != value.shape[-1]) > 32
    xFormers wasn't build with CUDA support
    dtype=torch.float16 (supported: {torch.float32})
    operator wasn't built - see python -m xformers.info for more info
    unsupported embed per head: 64

File "E:\ComfyUI_windows_portable-b\ComfyUI\execution.py", line 151, in recursive_execute
    output_data, output_ui = get_output_data(obj, input_data_all)
File "E:\ComfyUI_windows_portable-b\ComfyUI\execution.py", line 81, in get_output_data
    return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
File "E:\ComfyUI_windows_portable-b\ComfyUI\execution.py", line 74, in map_node_over_list
    results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
File "E:\ComfyUI_windows_portable-b\ComfyUI\custom_nodes\ComfyUI-SUPIR\nodes_v2.py", line 494, in sample
    _samples = self.sampler(denoiser, noised_z, cond=positive[i], uc=negative[i], x_center=sample.unsqueeze(0), control_scale=control_scale_end,
File "E:\ComfyUI_windows_portable-b\ComfyUI\custom_nodes\ComfyUI-SUPIR\sgm\modules\diffusionmodules\sampling.py", line 657, in __call__
    x, old_denoised = self.sampler_step(
File "E:\ComfyUI_windows_portable-b\ComfyUI\custom_nodes\ComfyUI-SUPIR\sgm\modules\diffusionmodules\sampling.py", line 609, in sampler_step
    denoised = self.denoise(x, denoiser, sigma, cond, uc, control_scale=control_scale)
File "E:\ComfyUI_windows_portable-b\ComfyUI\custom_nodes\ComfyUI-SUPIR\sgm\modules\diffusionmodules\sampling.py", line 573, in denoise
    denoised = denoiser(*self.guider.prepare_inputs(x, sigma, cond, uc), control_scale)
File "E:\ComfyUI_windows_portable-b\ComfyUI\custom_nodes\ComfyUI-SUPIR\nodes_v2.py", line 468, in <lambda>
    denoiser = lambda input, sigma, c, control_scale: SUPIR_model.denoiser(SUPIR_model.model, input, sigma, c, control_scale)
File "E:\ComfyUI_windows_portable-b\ComfyUI\custom_nodes\ComfyUI-SUPIR\sgm\modules\diffusionmodules\denoiser.py", line 73, in __call__
    return network(input * c_in, c_noise, cond, control_scale) * c_out + input * c_skip
File "E:\ComfyUI_windows_portable-b\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1511, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
File "E:\ComfyUI_windows_portable-b\python_embeded\Lib\site-packages\torch\nn\modules\module.py", line 1520, in _call_impl
    return forward_call(*args, **kwargs)
File "E:\ComfyUI_windows_portable-b\ComfyUI\custom_nodes\ComfyUI-SUPIR\sgm\modules\diffusionmodules\wrappers.py", line 96, in forward
    out = self.diffusion_model(
(the two torch\nn\modules\module.py frames, lines 1511 and 1520, repeat between each of the following frames)
File "E:\ComfyUI_windows_portable-b\ComfyUI\custom_nodes\ComfyUI-SUPIR\SUPIR\modules\SUPIR_v0.py", line 654, in forward
    h = self.project_modules[adapter_idx](control[control_idx], h, control_scale=control_scale)
File "E:\ComfyUI_windows_portable-b\ComfyUI\custom_nodes\ComfyUI-SUPIR\SUPIR\modules\SUPIR_v0.py", line 147, in forward
    x = self.attn(x, context)
File "E:\ComfyUI_windows_portable-b\ComfyUI\custom_nodes\ComfyUI-SUPIR\sgm\modules\attention.py", line 365, in forward
    out = xformers.ops.memory_efficient_attention(
File "E:\ComfyUI_windows_portable-b\python_embeded\Lib\site-packages\xformers\ops\fmha\__init__.py", line 268, in memory_efficient_attention
    return _memory_efficient_attention(
File "E:\ComfyUI_windows_portable-b\python_embeded\Lib\site-packages\xformers\ops\fmha\__init__.py", line 387, in _memory_efficient_attention
    return _memory_efficient_attention_forward(
File "E:\ComfyUI_windows_portable-b\python_embeded\Lib\site-packages\xformers\ops\fmha\__init__.py", line 403, in _memory_efficient_attention_forward
    op = _dispatch_fw(inp, False)
File "E:\ComfyUI_windows_portable-b\python_embeded\Lib\site-packages\xformers\ops\fmha\dispatch.py", line 125, in _dispatch_fw
    return _run_priority_list(
File "E:\ComfyUI_windows_portable-b\python_embeded\Lib\site-packages\xformers\ops\fmha\dispatch.py", line 65, in _run_priority_list
    raise NotImplementedError(msg)

kijai commented 6 months ago

Did you try simply running without xformers? It shouldn't be needed.

mr-bob-chang commented 6 months ago

I uninstalled xFormers by opening cmd in E:\ComfyUI_windows_portable-b\python_embeded and running pip uninstall xformers, but the same error message still appears.

kijai commented 6 months ago

The folder is right, but you do need to run the pip commands specifically with the python.exe in that folder, for example: `python.exe -m pip uninstall xformers`
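For anyone unsure which environment a pip command will actually touch, a quick generic check (plain Python, nothing ComfyUI-specific) is to ask the interpreter itself:

```python
import sys
import sysconfig

# The interpreter running this script; "<this path> -m pip ..." operates
# on exactly this environment, which is why the portable install needs
# python_embeded\python.exe rather than whatever "pip" is on your PATH.
print("interpreter:", sys.executable)

# Where pip installs third-party packages (e.g. xformers) for it.
print("site-packages:", sysconfig.get_paths()["purelib"])
```

Run with the embedded python.exe, both paths should point under python_embeded; if they point at a system-wide Python instead, you invoked the wrong interpreter.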

mr-bob-chang commented 6 months ago

OK, problem solved. But will uninstalling xFormers affect other plugins? What does xFormers do?

kijai commented 6 months ago

It's used for attention optimization on Nvidia GPUs. It was a bigger deal before PyTorch added its own similar method in torch 2.0; these days the difference is usually minimal. Some custom nodes may still rely on it, but ComfyUI itself doesn't even come with it anymore.

If you do have an Nvidia GPU and want it back, you can always try reinstalling it like this: `python.exe -m pip install xformers --no-deps`
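As a rough illustration of what is being optimized here: both xformers' memory_efficient_attention and torch 2.0's built-in scaled_dot_product_attention compute the same mathematical operation, just far faster and with much less memory. A single-head sketch in plain Python (batching, masking, and dtype handling omitted):

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def attention(q, k, v):
    """Scaled dot-product attention for one head.

    q, k, v are lists of vectors (lists of floats); k and v have the
    same length. Returns one output vector per query vector.
    """
    d = len(q[0])
    out = []
    for qi in q:
        # Similarity of this query to every key, scaled by sqrt(d).
        scores = [sum(a * b for a, b in zip(qi, kj)) / math.sqrt(d) for kj in k]
        w = softmax(scores)
        # Weighted average of the value vectors.
        out.append([sum(wi * vj[t] for wi, vj in zip(w, v))
                    for t in range(len(v[0]))])
    return out
```

The optimized kernels avoid materializing the full scores matrix, which is where the memory savings come from; numerically the result is the same.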

mr-bob-chang commented 6 months ago

Thank you for your reply. Your work has really helped me a lot. Thank you very much!

giusparsifal commented 2 months ago

Hi, even when I try to uninstall xformers from the python_embeded folder with the command you typed, it always goes for the main Python installation...

G:\ComfyUI_windows_portable>.\python_embeded\python.exe -m pip uninstall xformers
Found existing installation: xformers 0.0.27.post2+cu118
Uninstalling xformers-0.0.27.post2+cu118:
  Would remove:
    c:\users\pinoa\appdata\roaming\python\python311\site-packages\xformers-0.0.27.post2+cu118.dist-info*
    c:\users\pinoa\appdata\roaming\python\python311\site-packages\xformers*
Proceed (Y/n)?
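That "Would remove" path points at the roaming user site-packages, not the embedded folder, so the embedded interpreter is resolving xformers from the user site. A generic way to see where a given interpreter would import a package from (illustrative sketch, run it with whichever python.exe you care about):

```python
import importlib.util
import site
import sys

def package_location(name):
    # Path a package would be imported from, or None if it isn't installed.
    spec = importlib.util.find_spec(name)
    return getattr(spec, "origin", None) if spec else None

print("interpreter:", sys.executable)
print("user site-packages:", site.getusersitepackages())
print("json (stdlib example):", package_location("json"))
# Replace "json" with "xformers" to see which copy your interpreter resolves.
```

If the reported location is under appdata\roaming, the copy lives in the user site-packages and that is the one pip (correctly) offers to remove.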

azelylmz commented 2 months ago

Hey, i'm lost in this error :/ could someone help pls?

NotImplementedError: No operator found for memory_efficient_attention_forward with inputs:
    query     : shape=(2, 1024, 10, 64) (torch.float16)
    key       : shape=(2, 1024, 10, 64) (torch.float16)
    value     : shape=(2, 1024, 10, 64) (torch.float16)
    attn_bias : <class 'NoneType'>
    p         : 0.0
decoderF is not supported because:
    xFormers wasn't build with CUDA support
    attn_bias type is <class 'NoneType'>
    operator wasn't built - see python -m xformers.info for more info
flshattF@0.0.0 is not supported because:
    xFormers wasn't build with CUDA support
    operator wasn't built - see python -m xformers.info for more info
tritonflashattF is not supported because:
    xFormers wasn't build with CUDA support
    operator wasn't built - see python -m xformers.info for more info
    triton is not available
cutlassF is not supported because:
    xFormers wasn't build with CUDA support
    operator wasn't built - see python -m xformers.info for more info
smallkF is not supported because:
    max(query.shape[-1] != value.shape[-1]) > 32
    xFormers wasn't build with CUDA support
    dtype=torch.float16 (supported: {torch.float32})
    operator wasn't built - see python -m xformers.info for more info
    unsupported embed per head: 64

kijai commented 2 months ago

Uninstall xformers with `pip uninstall xformers`; if you need it for something else, re-install with `pip install xformers --no-deps`.

If you use the Windows portable install, those commands need to be run in the python_embeded folder, like so:

python.exe -m pip uninstall xformers

azelylmz commented 2 months ago

Thank you, I did, but I'm still getting the same error :/

kijai commented 2 months ago

Make sure you use the Python from your Comfy install; if you get the same error, then xformers wasn't uninstalled from the correct Python environment.
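One quick follow-up check, run with the same python.exe that launches Comfy (a generic sketch, not part of ComfyUI): if the uninstall hit the right environment, Python's import machinery no longer finds the package.

```python
import importlib.util

def is_installed(name):
    # True if the current interpreter can import the named package.
    return importlib.util.find_spec(name) is not None

# After a successful "python.exe -m pip uninstall xformers",
# this should print False for that interpreter.
print("xformers importable:", is_installed("xformers"))
```

If this still prints True, the package was removed from a different environment than the one Comfy runs in.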

cives commented 1 week ago

I think you must reference the right Python installation by running the command from inside the python_embeded folder, like this: .\python -m pip uninstall (... etc.)