AUTOMATIC1111 / stable-diffusion-webui

Stable Diffusion web UI
GNU Affero General Public License v3.0

[Bug]: [AMD GPU] Adding --xformers breaks installation (Linux) #10039

Open tankersss opened 1 year ago

tankersss commented 1 year ago

Is there an existing issue for this?

What happened?

When adding --xformers flag, I get an error

Traceback (most recent call last):
  File "/home/tankers/stable-diffusion-webui/launch.py", line 352, in <module>
    prepare_environment()
  File "/home/tankers/stable-diffusion-webui/launch.py", line 257, in prepare_environment
    run_python("import torch; assert torch.cuda.is_available(), 'Torch is not able to use GPU; add --skip-torch-cuda-test to COMMANDLINE_ARGS variable to disable this check'")
  File "/home/tankers/stable-diffusion-webui/launch.py", line 120, in run_python
    return run(f'"{python}" -c "{code}"', desc, errdesc)
  File "/home/tankers/stable-diffusion-webui/launch.py", line 96, in run
    raise RuntimeError(message)
RuntimeError: Error running command.
Command: "/home/tankers/stable-diffusion-webui/venv/bin/python3.10" -c "import torch; assert torch.cuda.is_available(), 'Torch is not able to use GPU; add --skip-torch-cuda-test to COMMANDLINE_ARGS variable to disable this check'"
Error code: 1
stdout: <empty>
stderr: Traceback (most recent call last):
  File "<string>", line 1, in <module>
AssertionError: Torch is not able to use GPU; add --skip-torch-cuda-test to COMMANDLINE_ARGS variable to disable this check

It persists even after removing the flag.
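If adding --xformers pulled a CUDA build of torch into the venv, one possible recovery (a sketch, not from the thread; the ROCm version tag is illustrative and should match your local ROCm install) is to reinstall the ROCm wheels:

```shell
# Sketch: reinstall a ROCm build of torch/torchvision into the webui venv.
# The rocm5.4.2 index tag is illustrative; pick the one matching your ROCm version.
source venv/bin/activate
pip uninstall -y torch torchvision
pip install torch torchvision --index-url https://download.pytorch.org/whl/rocm5.4.2
```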

After adding --skip-torch-cuda-test I get another error:

Launching Web UI with arguments: --listen --xformers --precision full --no-half --skip-torch-cuda-test
/home/tankers/stable-diffusion-webui/venv/lib64/python3.10/site-packages/torchvision/io/image.py:13: UserWarning: Failed to load image Python extension: libc10_hip.so: cannot open shared object file: No such file or directory
  warn(f"Failed to load image Python extension: {e}")
Segmentation fault (core dumped)
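For reference, a quick way to check which backend the venv's torch build targets (a sketch using the public `torch.version` attributes; `torch.version.hip` is `None` on CUDA builds and a version string on ROCm builds):

```python
# Sketch: report whether the installed torch is a ROCm (HIP), CUDA, or CPU build.
# Degrades gracefully when torch is not importable in the current environment.
def torch_backend():
    try:
        import torch
    except ImportError:
        return "torch not installed"
    hip = getattr(torch.version, "hip", None)    # version string on ROCm builds
    cuda = getattr(torch.version, "cuda", None)  # version string on CUDA builds
    if hip:
        return f"ROCm {hip}"
    if cuda:
        return f"CUDA {cuda}"
    return "CPU-only build"

print(torch_backend())
```

Run with the venv's interpreter (e.g. `venv/bin/python3.10`) so it inspects the same torch the web UI uses.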

Steps to reproduce the problem

  1. Clean install of Fedora Workstation
  2. Install python 3.10 with dnf
  3. Set python3.10 in webui-user.sh
  4. Set flags --precision full --no-half
  5. Start ./webui.sh
  6. It works fine
  7. Stop ./webui.sh
  8. Add --xformers
  9. Start ./webui.sh
  10. Error
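Steps 3–4 correspond to edits like the following in webui-user.sh (a sketch; `python_cmd` and `COMMANDLINE_ARGS` are the variables that script actually reads, the values shown are from this report):

```shell
# webui-user.sh (excerpt, illustrative)
python_cmd="python3.10"
export COMMANDLINE_ARGS="--precision full --no-half"
# step 8 then appends --xformers, which triggers the error above
```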

What should have happened?

There should have been no error, nor any mention of CUDA, since it's running on ROCm.

Commit where the problem happens

5ab7f213bec2f816f9c5644becb32eb72c8ffb89

What platforms do you use to access the UI ?

Linux

What browsers do you use to access the UI ?

No response

Command Line Arguments

--precision full --no-half --xformers

List of extensions

None

Console logs

See above

Additional information

No response

YHD233 commented 1 year ago

xformers only applies to NVIDIA GPUs

tankersss commented 1 year ago

Leaving it open, since it should not crash. There should be a message stating that xformers works only on NVIDIA (the console already reports that --xformers is not enabled even while using ROCm), and recovering shouldn't require reinstalling.

ClashSAN commented 1 year ago

in the xformers wiki page there is:

> "This optimization is only available for nvidia gpus"

> there is information that --xformers are not enabled

yes, that message is always shown in the console when you do not apply the xformers optimization, even with NVIDIA

mudjello commented 11 months ago

https://github.com/facebookresearch/xformers/issues/807#issuecomment-1653834071

My guess is they can't get HIPIFY working on the xformers source, or it's personal reasons for dragging their feet.