lshqqytiger / stable-diffusion-webui-amdgpu

Stable Diffusion web UI
GNU Affero General Public License v3.0

[Bug]: RuntimeError: Torch is not able to use GPU;... etc #544

Open SourSnappleOG opened 5 days ago

SourSnappleOG commented 5 days ago

Checklist

What happened?

When following the instructions for downloading this for AMD:

git clone https://github.com/lshqqytiger/stable-diffusion-webui-directml && cd stable-diffusion-webui-directml && git submodule init && git submodule update

then running webui-user.bat (on Windows), it simply states that Torch is not able to use the GPU and recommends adding --skip-torch-cuda-test. Avoiding that AMD integration issue should be the main reason to use this fork in the first place. Any ideas?
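For reference, I did not edit webui-user.bat at all; the stock file looks roughly like this (exact contents may differ slightly between versions of the fork), with COMMANDLINE_ARGS left empty:

@echo off
set PYTHON=
set GIT=
set VENV_DIR=
set COMMANDLINE_ARGS=
call webui.bat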

Steps to reproduce the problem

Run the git clone command above (into a new folder).

Double-click webui-user.bat.

What should have happened?

It should have run and opened the UI.

What browsers do you use to access the UI ?

No response

Sysinfo

... it won't even open...

Console logs

code\stable-diffusion-webui-directml\venv\lib\site-packages (22.2.1)
Collecting pip
  Using cached pip-24.2-py3-none-any.whl (1.8 MB)
Installing collected packages: pip
  Attempting uninstall: pip
    Found existing installation: pip 22.2.1
    Uninstalling pip-22.2.1:
      Successfully uninstalled pip-22.2.1
Successfully installed pip-24.2
venv "C:\Users\Drew\Code\stable-diffusion-webui-directml\venv\Scripts\Python.exe"
Python 3.10.6 (tags/v3.10.6:9c7b4bd, Aug  1 2022, 21:53:49) [MSC v.1932 64 bit (AMD64)]
Version: v1.10.1-amd-11-gefddd05e
Commit hash: efddd05e11d9cc5339a41192457e6ff8ad06ae00
Installing torch and torchvision
Looking in indexes: https://pypi.org/simple, https://download.pytorch.org/whl/cu121
Collecting torch==2.3.1
  Using cached https://download.pytorch.org/whl/cu121/torch-2.3.1%2Bcu121-cp310-cp310-win_amd64.whl (2423.5 MB)
Collecting torchvision
  Using cached https://download.pytorch.org/whl/cu121/torchvision-0.19.1%2Bcu121-cp310-cp310-win_amd64.whl (5.8 MB)
Collecting filelock (from torch==2.3.1)
  Using cached filelock-3.16.1-py3-none-any.whl.metadata (2.9 kB)
Collecting typing-extensions>=4.8.0 (from torch==2.3.1)
  Using cached typing_extensions-4.12.2-py3-none-any.whl.metadata (3.0 kB)
Collecting sympy (from torch==2.3.1)
  Using cached sympy-1.13.3-py3-none-any.whl.metadata (12 kB)
Collecting networkx (from torch==2.3.1)
  Using cached networkx-3.3-py3-none-any.whl.metadata (5.1 kB)
Collecting jinja2 (from torch==2.3.1)
  Using cached jinja2-3.1.4-py3-none-any.whl.metadata (2.6 kB)
Collecting fsspec (from torch==2.3.1)
  Using cached fsspec-2024.9.0-py3-none-any.whl.metadata (11 kB)
Collecting mkl<=2021.4.0,>=2021.1.1 (from torch==2.3.1)
  Using cached https://download.pytorch.org/whl/mkl-2021.4.0-py2.py3-none-win_amd64.whl (228.5 MB)
Collecting numpy (from torchvision)
  Using cached numpy-2.1.2-cp310-cp310-win_amd64.whl.metadata (59 kB)
INFO: pip is looking at multiple versions of torchvision to determine which version is compatible with other requirements. This could take a while.
Collecting torchvision
  Using cached torchvision-0.19.1-cp310-cp310-win_amd64.whl.metadata (6.1 kB)
  Using cached https://download.pytorch.org/whl/cu121/torchvision-0.19.0%2Bcu121-cp310-cp310-win_amd64.whl (5.8 MB)
  Using cached torchvision-0.19.0-1-cp310-cp310-win_amd64.whl.metadata (6.1 kB)
Collecting numpy<2 (from torchvision)
  Using cached numpy-1.26.4-cp310-cp310-win_amd64.whl.metadata (61 kB)
Collecting torchvision
  Using cached https://download.pytorch.org/whl/cu121/torchvision-0.18.1%2Bcu121-cp310-cp310-win_amd64.whl (5.7 MB)
Collecting pillow!=8.3.*,>=5.3.0 (from torchvision)
  Using cached pillow-10.4.0-cp310-cp310-win_amd64.whl.metadata (9.3 kB)
Collecting intel-openmp==2021.* (from mkl<=2021.4.0,>=2021.1.1->torch==2.3.1)
  Using cached https://download.pytorch.org/whl/intel_openmp-2021.4.0-py2.py3-none-win_amd64.whl (3.5 MB)
Collecting tbb==2021.* (from mkl<=2021.4.0,>=2021.1.1->torch==2.3.1)
  Using cached tbb-2021.13.1-py3-none-win_amd64.whl.metadata (1.1 kB)
Collecting MarkupSafe>=2.0 (from jinja2->torch==2.3.1)
  Using cached MarkupSafe-3.0.1-cp310-cp310-win_amd64.whl.metadata (4.1 kB)
Collecting mpmath<1.4,>=1.1.0 (from sympy->torch==2.3.1)
  Using cached https://download.pytorch.org/whl/mpmath-1.3.0-py3-none-any.whl (536 kB)
Using cached tbb-2021.13.1-py3-none-win_amd64.whl (286 kB)
Using cached pillow-10.4.0-cp310-cp310-win_amd64.whl (2.6 MB)
Using cached typing_extensions-4.12.2-py3-none-any.whl (37 kB)
Using cached filelock-3.16.1-py3-none-any.whl (16 kB)
Using cached fsspec-2024.9.0-py3-none-any.whl (179 kB)
Using cached jinja2-3.1.4-py3-none-any.whl (133 kB)
Using cached networkx-3.3-py3-none-any.whl (1.7 MB)
Using cached numpy-2.1.2-cp310-cp310-win_amd64.whl (12.9 MB)
Using cached sympy-1.13.3-py3-none-any.whl (6.2 MB)
Using cached MarkupSafe-3.0.1-cp310-cp310-win_amd64.whl (15 kB)
Installing collected packages: tbb, mpmath, intel-openmp, typing-extensions, sympy, pillow, numpy, networkx, mkl, MarkupSafe, fsspec, filelock, jinja2, torch, torchvision
Successfully installed MarkupSafe-3.0.1 filelock-3.16.1 fsspec-2024.9.0 intel-openmp-2021.4.0 jinja2-3.1.4 mkl-2021.4.0 mpmath-1.3.0 networkx-3.3 numpy-2.1.2 pillow-10.4.0 sympy-1.13.3 tbb-2021.13.1 torch-2.3.1+cu121 torchvision-0.18.1+cu121 typing-extensions-4.12.2
Traceback (most recent call last):


Additional information

No response

CS1o commented 1 day ago

Hey, open up a cmd window and run pip cache purge. Then delete the venv folder and relaunch webui-user.bat.
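Roughly, from a cmd opened inside the stable-diffusion-webui-directml folder (a sketch of the steps above; adjust the path to wherever you cloned it):

pip cache purge
rmdir /S /Q venv
webui-user.bat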

If that doesn't help, check out my install guides: https://github.com/CS1o/Stable-Diffusion-Info/wiki/Webui-Installation-Guides

And provide a full cmd log if you get an error again.

lshqqytiger commented 1 day ago

You need to add --use-zluda or --use-directml for AMD GPUs. https://github.com/lshqqytiger/stable-diffusion-webui-amdgpu?tab=readme-ov-file#installation-and-running
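For example, assuming the stock webui-user.bat layout, edit the COMMANDLINE_ARGS line to one of:

set COMMANDLINE_ARGS=--use-directml

or

set COMMANDLINE_ARGS=--use-zluda

then relaunch webui-user.bat.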