Sygil-Dev / stable-diffusion

GNU Affero General Public License v3.0

Script for AMD? #272

Closed Apoc9512 closed 2 years ago

Apoc9512 commented 2 years ago

I followed the instructions, yet ran into the issue of not finding a CUDA/NVIDIA GPU. Is there something I can modify or some fork of the script that will work on AMD cards?

cstueckrath commented 2 years ago

read here: https://pytorch.org/get-started/locally/ -> rocm (you'll have to use pip inside of anaconda: https://stackoverflow.com/questions/41060382/using-pip-to-install-packages-to-anaconda-environment ) and here: https://rocmdocs.amd.com/en/latest/
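A minimal sketch of that pip-inside-conda route (the environment name `ldm-rocm` is made up here, and the ROCm version tag in the index URL changes between releases — copy the exact command from the pytorch.org selector for your setup):

```shell
# Create and activate a conda env, then use pip *inside* it so the
# ROCm wheel is installed instead of the default CUDA build.
conda create -n ldm-rocm python=3.8 -y
conda activate ldm-rocm

# ROCm build of PyTorch; the rocm5.x tag below is an example — take the
# exact URL from https://pytorch.org/get-started/locally/
pip install torch torchvision --extra-index-url https://download.pytorch.org/whl/rocm5.1.1
```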


SJDunkelman commented 2 years ago

I've installed the non-CUDA build using pip within conda, but it's still showing the same issue. How do I get webui.cmd to use the pip-installed package?

cstueckrath commented 2 years ago

try calling `device = torch.device('cuda')`

What happens then? This should work with ROCm out of the box (see https://github.com/pytorch/pytorch/issues/10670).
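A small sketch of that check, assuming a ROCm (or CUDA) build of PyTorch is installed — ROCm builds report the AMD GPU through the regular `cuda` backend name, so the same code path covers both vendors:

```python
def pick_device():
    """Return torch.device('cuda') if a GPU is visible, else fall back to CPU.

    ROCm builds of PyTorch expose the AMD GPU under the 'cuda' device
    name, so torch.cuda.is_available() covers NVIDIA and AMD alike.
    """
    import torch  # deferred import: the helper can be defined even without torch present
    if torch.cuda.is_available():
        return torch.device("cuda")
    return torch.device("cpu")
```

If this returns the CPU device on a ROCm install, the runtime is not seeing the card — typically an unsupported GPU, a missing kernel driver, or the CUDA wheel having been installed instead of the ROCm one.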

phreeware commented 2 years ago


So this will only work on Unix, since ROCm is Unix-only? Any way to get around this? Would WSL work (https://docs.microsoft.com/en-us/windows/wsl/install)?

kik4444 commented 2 years ago

Is any effort being made, on this repo or a fork, towards a Docker setup for AMD plus the webui? I'd like to try running this on my Linux PC, but I'm put off by how much tweaking I'll have to do with little help online compared to Nvidia. I'm confident in my knowledge of Linux and Python, but my next-to-nonexistent knowledge of AI makes me think I'll mess something up.

cstueckrath commented 2 years ago

WSL won't work this way. You'll have to install a Linux distribution (don't use Ubuntu 22.04; only 20.04 is supported right now). You need a supported GPU, too! A Navi 22 (e.g. the Radeon RX 6700 XT) is NOT supported by ROCm!

It might be possible to work around all this by manually patching, and even to get it working in WSL2 with help from Microsoft: https://github.com/microsoft/antares

But this is all too much for me to dig into alone... I hope someone can find a solution that works.

cstueckrath commented 2 years ago

you can try the stuff I wrote here: https://github.com/hlky/stable-diffusion/discussions/214#discussioncomment-3566173

I cannot test this myself because of my GPU.