Stability-AI / StableSwarmUI

StableSwarmUI, a modular Stable Diffusion web user interface, with an emphasis on making power tools easily accessible, high performance, and extensibility.
MIT License

Automate Support For Old AMD GPUs (DirectML) #23

Open hdawod opened 11 months ago

hdawod commented 11 months ago

Hi, I followed the default installation process, but when I run StableSwarmUI I receive the error message "Some backends have errored on the server. Check the server logs for details." I have an MSI Alpha 15 (Ryzen 7) with an AMD Radeon RX GPU. Does StableSwarmUI support my AMD GPU? Any help, please?


hdawod commented 11 months ago

The platform is Windows 10.

mcmonkey4eva commented 11 months ago

Check the console window; there should be an error message there.

midihex commented 11 months ago

It should be a full path, e.g. `C:\Comfy\ComfyUI\main.py`. Give that a shot.

hdawod commented 11 months ago

@mcmonkey4eva I checked the console window and it shows an error regarding the GPU. I have an 8 GB AMD GPU, but the error is asking for Nvidia. @midihex This is driving me crazy. I changed the path with no luck.

midihex commented 11 months ago

The ComfyUI GitHub mentions some extra steps to make ComfyUI work with AMD:

`pip install torch-directml`, and add `--directml` in the backend Extra Args box.

But do check this information for yourself first.

I'm assuming automatic AMD configuration isn't in the current scope of Swarm; it's more of a ComfyUI setup issue.
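The two steps above can be sketched as a small shell snippet. This is an unverified sketch based on the thread: the Extra Args field name comes from the comments here, and since `torch-directml` only publishes Windows wheels, the install is guarded so it fails gracefully elsewhere.

```shell
# Run this in the Python environment the Comfy backend uses.
# torch-directml only ships Windows wheels, so guard the install:
pip install torch-directml || echo "torch-directml wheel unavailable (Windows-only package)"
# Then, in SwarmUI's backend settings, add this to the Extra Args box:
#   --directml
```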

hdawod commented 11 months ago

Thanks @midihex. I installed directml and added the flag in the Extra Args box, but it still shows an odd error: `ModuleNotFoundError: No module named 'torch_directml'`. I tried to solve this error before but failed. Sorry if I'm giving you a headache! Here is the console window message:

midihex commented 11 months ago

Might be worth asking for help over on the comfyui hub

Also, I didn't realise that you'd installed ComfyUI via the Swarm install process, so your original path that started with dlbackend/ was fine. My example was for the case where you already had ComfyUI installed.

But still, the error looks to be ComfyUI related. When you installed torch-directml, were there no errors?

LinuxAITottiLabs commented 11 months ago

> Thanks @midihex. I installed directml and added the flag in the Extra Args box, but it still shows an odd error: `ModuleNotFoundError: No module named 'torch_directml'`. I tried to solve this error before but failed. Sorry if I'm giving you a headache! Here is the console window message:

Hello, I think I know what you are trying to do. I am on Mac, but I will give you a suggestion to try:

1. Go to the folder dlbackend/ComfyUI. Make sure you activate the venv:
2. `source venv/bin/activate`
3. `pip install -r requirements.txt` (or whatever you want to install in your environment)
4. Launch your script.
5. Set the StartScript path to main.py.
6. Set ExtraArgs; I use `--normalvram`.

Let me know if that works for you.
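The suggested steps can be sketched as a shell snippet (Linux/macOS). This is a hedged sketch: the `dlbackend/ComfyUI` path is the Swarm-managed install location mentioned in this thread, and the venv activation is guarded because, as noted later in the thread, the default install may not have a venv at all.

```shell
# Sketch of steps 1-3 above. Adjust COMFY_DIR for your own setup.
COMFY_DIR="dlbackend/ComfyUI"
if [ -d "$COMFY_DIR" ]; then
    cd "$COMFY_DIR"
    # Step 2: activate the venv, if one exists (default installs may have none)
    [ -f venv/bin/activate ] && . venv/bin/activate
    # Step 3: install the backend's requirements into that environment
    pip install -r requirements.txt
else
    echo "ComfyUI folder not found at $COMFY_DIR; adjust the path for your setup"
fi
```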

hdawod commented 11 months ago

Hi @tottiaa, could you explain in more detail how to do it? I tried to follow the steps, but there is no venv in the ComfyUI folder. I installed a virtual environment and then the venv folder appeared. I ran the activate file by double-clicking, but nothing happened. I can't do steps 3 to 5.

mcmonkey4eva commented 11 months ago

There isn't a venv in the default comfy install. You'll want to

EDIT: and, as said above, add `--directml` in the backend Extra Args box.

EDIT2: according to Comfy, DirectML requires torch 2.0 (as opposed to the current 2.1), so you'll also need to backdate torch.

mcmonkey4eva commented 6 months ago

Update: I now have the logic to automatically install with AMD compatibility... in theory.

In practice, on Windows, torch-directml is a deeply broken library that's basically unsupported, so the installer won't actually work until/unless they fix it. Notably, Comfy uses Python 3.11 or 3.12 and torch 2.1, while directml requires Python 3.10 and torch 2.0. Why such strict version locks, I don't even know. But they're in the way of compatibility for any installation method that isn't manual hackery.
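The version locks described above could be captured in a small pre-flight check. This is a hypothetical sketch, not StableSwarmUI's actual installer logic: the requirements (Python 3.10, torch 2.0) are as reported in this comment, and the function names are my own.

```python
# Hypothetical pre-flight checks for the version locks reported above:
# torch-directml needs Python 3.10 and torch 2.0, while Comfy ships with
# Python 3.11/3.12 and torch 2.1.

def python_ok_for_directml(major: int, minor: int) -> bool:
    """torch-directml reportedly only supports Python 3.10."""
    return (major, minor) == (3, 10)

def torch_ok_for_directml(version: str) -> bool:
    """torch-directml reportedly only supports the torch 2.0.x series.

    `version` is a torch version string, e.g. "2.0.1+cpu" or "2.1.0".
    """
    base = version.split("+")[0]           # strip the local tag like "+cpu"
    major, minor = base.split(".")[:2]
    return (int(major), int(minor)) == (2, 0)
```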

For Linux users I expect it should just work via rocm (untested).

terry45 commented 5 months ago

I can confirm the AMD install doesn't work; it installs the Nvidia torch instead of the ROCm version.
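One quick way to check which torch build an installer actually pulled in is to inspect the local version tag in `torch.__version__` (ROCm wheels carry a `+rocmX.Y` tag, CUDA wheels a `+cuXXX` tag). A minimal sketch; the helper name is my own:

```python
# Minimal sketch: classify a torch build by its version string's local tag.
# ROCm wheels look like "2.1.0+rocm5.6", CUDA wheels like "2.1.0+cu121".

def torch_build_flavor(version: str) -> str:
    if "+rocm" in version:
        return "rocm"
    if "+cu" in version:
        return "cuda"
    return "cpu/other"

# In practice you'd run torch_build_flavor(torch.__version__) in the
# backend's Python to see whether the ROCm build got installed.
```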

mcmonkey4eva commented 1 month ago

I tested the Windows installer and managed to get it to actually work, barely. On an RX 7900 XT (20 GiB VRAM) it installs and works, but you need to disable previews, it fills all 20 GiB of VRAM, and it takes forever to generate anything in SDXL. I hate it.

ZLUDA is probably a better option, but that has a pile of its own complications. Argh.