AUTOMATIC1111 / stable-diffusion-webui

Stable Diffusion web UI
GNU Affero General Public License v3.0

Torch is not able to see GPU #15614

Open · marisaa-kirisame opened this issue 4 weeks ago

marisaa-kirisame commented 4 weeks ago


What happened?

Upon installing webui on Arch Linux, running ./webui.sh results in "Torch is not able to see GPU". I am on an NVIDIA RTX 4070, and it used to work before.

Steps to reproduce the problem

  1. go to user home folder
  2. git clone the repository
  3. chmod +x webui.sh
  4. ./webui.sh
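
Equivalently, as shell commands (the standard repository URL is assumed):

```bash
cd ~
git clone https://github.com/AUTOMATIC1111/stable-diffusion-webui.git
cd stable-diffusion-webui
chmod +x webui.sh
./webui.sh
```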

What should have happened?

WebUI should have started up normally

What browsers do you use to access the UI?

Mozilla Firefox

Sysinfo

Could not obtain SysInfo.

Traceback (most recent call last):
  File "/home/angel/stable-diffusion-webui/launch.py", line 48, in <module>
    main()
  File "/home/angel/stable-diffusion-webui/launch.py", line 29, in main
    filename = launch_utils.dump_sysinfo()
  File "/home/angel/stable-diffusion-webui/modules/launch_utils.py", line 473, in dump_sysinfo
    from modules import sysinfo
  File "/home/angel/stable-diffusion-webui/modules/sysinfo.py", line 8, in <module>
    import psutil
ModuleNotFoundError: No module named 'psutil'
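
The sysinfo failure itself is just a missing dependency in the half-built venv, not the underlying problem. A sketch for getting a sysinfo dump anyway, assuming the default ./venv location and the --dump-sysinfo launcher flag:

```bash
# psutil is a normal webui requirement that never got installed because setup aborted early.
./venv/bin/pip install psutil
# Re-run the dump; it should write a sysinfo-*.json file into the webui folder.
./venv/bin/python launch.py --dump-sysinfo
```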

Console logs

################################################################
Install script for stable-diffusion + Web UI
Tested on Debian 11 (Bullseye), Fedora 34+ and openSUSE Leap 15.4 or newer.
################################################################

################################################################
Running on angel user
################################################################

################################################################
Repo already cloned, using it as install directory
################################################################

################################################################
Create and activate python venv
################################################################

################################################################
Launching launch.py...
################################################################
glibc version is 2.39
Check TCMalloc: libtcmalloc_minimal.so.4
./webui.sh: line 251: bc: command not found
./webui.sh: line 251: [: -eq: unary operator expected
libtcmalloc_minimal.so.4 is linked with libc.so,execute LD_PRELOAD=/usr/lib/libtcmalloc_minimal.so.4
Python 3.10.14 (main, Apr 23 2024, 22:15:36) [GCC 13.2.1 20230801]
Version: v1.9.3
Commit hash: 1c0a0c4c26f78c32095ebc7f8af82f5c04fca8c0
Installing torch and torchvision
Looking in indexes: https://download.pytorch.org/whl/nightly/rocm6.0
Collecting torch
  Downloading https://download.pytorch.org/whl/nightly/rocm6.0/torch-2.4.0.dev20240423%2Brocm6.0-cp310-cp310-linux_x86_64.whl (2195.9 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.2/2.2 GB 1.7 MB/s eta 0:00:00
Collecting torchvision
  Downloading https://download.pytorch.org/whl/nightly/rocm6.0/torchvision-0.19.0.dev20240423%2Brocm6.0-cp310-cp310-linux_x86_64.whl (65.9 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 65.9/65.9 MB 36.1 MB/s eta 0:00:00
Collecting jinja2
  Downloading https://download.pytorch.org/whl/nightly/Jinja2-3.1.3-py3-none-any.whl (133 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 133.2/133.2 kB 10.3 MB/s eta 0:00:00
Collecting filelock
  Downloading https://download.pytorch.org/whl/nightly/filelock-3.13.1-py3-none-any.whl (11 kB)
Collecting fsspec
  Downloading https://download.pytorch.org/whl/nightly/fsspec-2024.2.0-py3-none-any.whl (170 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 170.9/170.9 kB 14.7 MB/s eta 0:00:00
Collecting pytorch-triton-rocm==3.0.0+0a22a91d04
  Downloading https://download.pytorch.org/whl/nightly/pytorch_triton_rocm-3.0.0%2B0a22a91d04-cp310-cp310-linux_x86_64.whl (234.3 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 234.3/234.3 MB 13.6 MB/s eta 0:00:00
Collecting typing-extensions>=4.8.0
  Downloading https://download.pytorch.org/whl/nightly/typing_extensions-4.8.0-py3-none-any.whl (31 kB)
Collecting networkx
  Downloading https://download.pytorch.org/whl/nightly/networkx-3.2.1-py3-none-any.whl (1.6 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.6/1.6 MB 71.5 MB/s eta 0:00:00
Collecting sympy
  Downloading https://download.pytorch.org/whl/nightly/sympy-1.12-py3-none-any.whl (5.7 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 5.7/5.7 MB 74.3 MB/s eta 0:00:00
Collecting numpy
  Downloading https://download.pytorch.org/whl/nightly/numpy-1.26.4-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (18.2 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 18.2/18.2 MB 69.4 MB/s eta 0:00:00
Collecting pillow!=8.3.*,>=5.3.0
  Downloading https://download.pytorch.org/whl/nightly/Pillow-9.3.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.2 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 3.2/3.2 MB 68.7 MB/s eta 0:00:00
Collecting MarkupSafe>=2.0
  Downloading https://download.pytorch.org/whl/nightly/MarkupSafe-2.1.5-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (25 kB)
Collecting mpmath>=0.19
  Downloading https://download.pytorch.org/whl/nightly/mpmath-1.2.1-py3-none-any.whl (532 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 532.6/532.6 kB 33.9 MB/s eta 0:00:00
Installing collected packages: mpmath, typing-extensions, sympy, pillow, numpy, networkx, MarkupSafe, fsspec, filelock, pytorch-triton-rocm, jinja2, torch, torchvision
Successfully installed MarkupSafe-2.1.5 filelock-3.13.1 fsspec-2024.2.0 jinja2-3.1.3 mpmath-1.2.1 networkx-3.2.1 numpy-1.26.4 pillow-9.3.0 pytorch-triton-rocm-3.0.0+0a22a91d04 sympy-1.12 torch-2.4.0.dev20240423+rocm6.0 torchvision-0.19.0.dev20240423+rocm6.0 typing-extensions-4.8.0
WARNING: There was an error checking the latest version of pip.
Traceback (most recent call last):
  File "/home/angel/stable-diffusion-webui/launch.py", line 48, in <module>
    main()
  File "/home/angel/stable-diffusion-webui/launch.py", line 39, in main
    prepare_environment()
  File "/home/angel/stable-diffusion-webui/modules/launch_utils.py", line 386, in prepare_environment
    raise RuntimeError(
RuntimeError: Torch is not able to use GPU; add --skip-torch-cuda-test to COMMANDLINE_ARGS variable to disable this check

Additional information

I have tried deleting the venv multiple times. I installed Python 3.10 from the AUR.
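
One thing that stands out in the console log: pip resolved torch from the ROCm nightly index (https://download.pytorch.org/whl/nightly/rocm6.0) and installed torch-2.4.0.dev20240423+rocm6.0, i.e. an AMD ROCm build rather than a CUDA build, which would explain why Torch cannot use an NVIDIA card. A quick way to confirm which build ended up in the venv (default ./venv path assumed):

```bash
# torch.version.cuda is None on ROCm builds; torch.version.hip is None on CUDA builds.
./venv/bin/python -c "import torch; print(torch.__version__, torch.version.cuda, torch.version.hip, torch.cuda.is_available())"
```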

praksharma commented 3 weeks ago

Same error. Just now. It was working last week.

ivanp7 commented 3 weeks ago

I can confirm. It was working before, but stopped after an update.

dklange commented 3 weeks ago

Same error as this, but with a different path.

Traceback (most recent call last):
  File "/home/angel/stable-diffusion-webui/launch.py", line 48, in <module>
    main()
  File "/home/angel/stable-diffusion-webui/launch.py", line 39, in main
    prepare_environment()
  File "/home/angel/stable-diffusion-webui/modules/launch_utils.py", line 386, in prepare_environment
    raise RuntimeError(
RuntimeError: Torch is not able to use GPU; add --skip-torch-cuda-test to COMMANDLINE_ARGS variable to disable this check

Clean install of Windows 11 with an RTX 3080 and an Intel chip.

Installed Git, installed Python, cloned the repo, and I get the same error.

dklange commented 3 weeks ago

> Same error as this, but with a different path.
>
> Traceback (most recent call last): File "/home/angel/stable-diffusion-webui/launch.py", line 48, in main() File "/home/angel/stable-diffusion-webui/launch.py", line 39, in main prepare_environment() File "/home/angel/stable-diffusion-webui/modules/launch_utils.py", line 386, in prepare_environment raise RuntimeError( RuntimeError: Torch is not able to use GPU; add --skip-torch-cuda-test to COMMANDLINE_ARGS
>
> Clean install of Windows 11 with an RTX 3080 and an Intel chip.
>
> Installed Git, installed Python, cloned the repo, and I get the same error.

I reinstalled the latest NVIDIA driver and it worked. I don't know if that was the issue or not, though.

marisaa-kirisame commented 3 weeks ago

> Same error as this, but with a different path.
>
> Traceback (most recent call last): File "/home/angel/stable-diffusion-webui/launch.py", line 48, in main() File "/home/angel/stable-diffusion-webui/launch.py", line 39, in main prepare_environment() File "/home/angel/stable-diffusion-webui/modules/launch_utils.py", line 386, in prepare_environment raise RuntimeError( RuntimeError: Torch is not able to use GPU; add --skip-torch-cuda-test to COMMANDLINE_ARGS
>
> Clean install of Windows 11 with an RTX 3080 and an Intel chip. Installed Git, installed Python, cloned the repo, and I get the same error.
>
> I reinstalled the latest NVIDIA driver and it worked. I don't know if that was the issue or not, though.

Still doesn't work on Arch after updating the system and restarting. This seems to be a webui-specific issue, because ComfyUI works flawlessly.
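
Since ComfyUI works on the same machine, comparing what the two environments actually installed might narrow it down; a rough check, with both venv paths being assumptions:

```bash
# Working ComfyUI environment vs the venv that webui.sh created
~/ComfyUI/venv/bin/python -c "import torch; print(torch.__version__, torch.cuda.is_available())"
~/stable-diffusion-webui/venv/bin/python -c "import torch; print(torch.__version__, torch.cuda.is_available())"
```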

ALIJ0N commented 3 weeks ago

> Same error as this, but with a different path.
>
> Traceback (most recent call last): File "/home/angel/stable-diffusion-webui/launch.py", line 48, in main() File "/home/angel/stable-diffusion-webui/launch.py", line 39, in main prepare_environment() File "/home/angel/stable-diffusion-webui/modules/launch_utils.py", line 386, in prepare_environment raise RuntimeError( RuntimeError: Torch is not able to use GPU; add --skip-torch-cuda-test to COMMANDLINE_ARGS
>
> Clean install of Windows 11 with an RTX 3080 and an Intel chip.
>
> Installed Git, installed Python, cloned the repo, and I get the same error.

Bro, just use DirectML. I found this video in the depths of the internet: https://youtu.be/zWNJfP-wPIo

Bewinxed commented 2 weeks ago

If you installed cuDNN or the latest NVIDIA drivers, you may have been updated to CUDA 12.4.

Follow this to install the latest nightly torch and everything should work 👍
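
For this repo that usually means overriding the torch install the launcher performs. A sketch using the TORCH_COMMAND variable that webui-user.sh exposes; the index URL is only an example and should match whatever CUDA version the linked instructions target:

```bash
# webui-user.sh -- have the launcher install a newer CUDA build of torch
# (swap cu121 for the index the guide recommends, e.g. a nightly cu124 one).
export TORCH_COMMAND="pip install torch torchvision --index-url https://download.pytorch.org/whl/cu121"
# Delete the old venv (or launch once with --reinstall-torch) so the override actually runs:
# rm -rf venv && ./webui.sh
```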

marisaa-kirisame commented 2 weeks ago

> Same error as this, but with a different path. Traceback (most recent call last): File "/home/angel/stable-diffusion-webui/launch.py", line 48, in main() File "/home/angel/stable-diffusion-webui/launch.py", line 39, in main prepare_environment() File "/home/angel/stable-diffusion-webui/modules/launch_utils.py", line 386, in prepare_environment raise RuntimeError( RuntimeError: Torch is not able to use GPU; add --skip-torch-cuda-test to COMMANDLINE_ARGS Clean install of Windows 11 with an RTX 3080 and an Intel chip. Installed Git, installed Python, cloned the repo, and I get the same error.
>
> Bro, just use DirectML. I found this video in the depths of the internet: https://youtu.be/zWNJfP-wPIo

That is a tutorial for AMD users on Windows, which I am not: I don't have an activation batch script, and I don't need to skip the CUDA check because I have a CUDA-capable graphics card. Also, pip can't seem to find torch-directml.

marisaa-kirisame commented 2 weeks ago

> If you installed cuDNN or the latest NVIDIA drivers, you may have been updated to CUDA 12.4.
>
> Follow this to install the latest nightly torch and everything should work 👍

That doesn't seem to have done anything; the error persists.

austinksmith commented 1 week ago

I can confirm this issue. I am running the latest NVIDIA driver, 470.239.06, and CUDA 11.4 support is available:

+-----------------------------------------------------------------------------+
| NVIDIA-SMI 470.239.06   Driver Version: 470.239.06   CUDA Version: 11.4     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|===============================+======================+======================|
|   0  Tesla K80           Off  | 00000000:04:00.0 Off |                  Off |
| N/A   42C    P8    29W / 149W |      3MiB / 12206MiB |      0%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+

+-----------------------------------------------------------------------------+
| Processes:                                                                  |
|  GPU   GI   CI        PID   Type   Process name                  GPU Memory |
|        ID   ID                                                   Usage      |
|=============================================================================|
|  No running processes found                                                 |
+-----------------------------------------------------------------------------+