CS1o / Stable-Diffusion-Info

Stable Diffusion Knowledge Base (Setups, Basics, Guides and more)
26 stars · 0 forks

ComfyUI Zluda Install Guide Note - Try different Repo #3

Open pw405 opened 2 weeks ago

pw405 commented 2 weeks ago

Hi! Thanks so much for putting these guides together. I'm enjoying Flux with ComfyUI on an XTX thanks to your guide!

BUT... I had to use a different repo than the one listed in the guide. For some reason the LeagueRaINi repo wouldn't update successfully, and I could never get the missing nodes I needed for Flux to work.

However, I found that if I use this repo, I can update nodes, ComfyUI itself updates correctly, and I've successfully tested many UNets & checkpoints with it:

https://github.com/patientx/ComfyUI-Zluda

F0xiiNat0r commented 2 weeks ago

I've followed all the steps in the guide to the letter, but it always ends with "RuntimeError: No CUDA GPUs are available". I have an RX 7700 XT and I just can't get it working. Do you think I should try following the same steps again but use that repo instead?

Here's the full error when running the .bat file btw:

```
Activating virtual environment...
Running main.py with --auto-launch argument...
Traceback (most recent call last):
  File "D:\Things\SD-Zluda\ComfyUI\main.py", line 86, in <module>
    import execution
  File "D:\Things\SD-Zluda\ComfyUI\execution.py", line 12, in <module>
    import nodes
  File "D:\Things\SD-Zluda\ComfyUI\nodes.py", line 21, in <module>
    import comfy.diffusers_load
  File "D:\Things\SD-Zluda\ComfyUI\comfy\diffusers_load.py", line 3, in <module>
    import comfy.sd
  File "D:\Things\SD-Zluda\ComfyUI\comfy\sd.py", line 5, in <module>
    from comfy import model_management
  File "D:\Things\SD-Zluda\ComfyUI\comfy\model_management.py", line 137, in <module>
    total_vram = get_total_memory(get_torch_device()) / (1024 * 1024)
  File "D:\Things\SD-Zluda\ComfyUI\comfy\model_management.py", line 106, in get_torch_device
    return torch.device(torch.cuda.current_device())
  File "D:\Things\SD-Zluda\ComfyUI\venv\lib\site-packages\torch\cuda\__init__.py", line 778, in current_device
    _lazy_init()
  File "D:\Things\SD-Zluda\ComfyUI\venv\lib\site-packages\torch\cuda\__init__.py", line 293, in _lazy_init
    torch._C._cuda_init()
RuntimeError: No CUDA GPUs are available
Deactivating virtual environment...
```
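The failure happens when ComfyUI asks torch for a CUDA device at import time. As a minimal diagnostic sketch (not part of the guide), running something like this from inside the ComfyUI venv shows whether the ZLUDA-patched torch sees the GPU at all; the `ImportError` branch is only there so the snippet also runs outside the venv:

```python
# Minimal diagnostic sketch: does this environment's torch see a (ZLUDA) CUDA device?
# Run it inside the ComfyUI venv; outside it, the ImportError branch fires instead.

def cuda_status():
    """Return True/False for torch.cuda availability, or None if torch is missing."""
    try:
        import torch
    except ImportError:
        return None
    return torch.cuda.is_available()

status = cuda_status()
if status is None:
    print("torch is not installed in this environment")
elif status:
    import torch
    # With ZLUDA working, the AMD card shows up here via the CUDA API
    print("GPU visible:", torch.cuda.get_device_name(0))
else:
    print("torch is installed but reports no CUDA GPUs (same state as the traceback above)")
```

If this prints the "no CUDA GPUs" line even in the venv, the problem is in the ZLUDA/HIP setup rather than in ComfyUI itself.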

I could also just be SOL as a whole.

pw405 commented 2 weeks ago

I know people have got it working with a 7800XT, so it should work fine. But yes, I would just delete that entire folder and use the repo from Patient X.

Did you set the environment variables in Windows?
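A missing HIP path is one common reason torch reports no GPUs under ZLUDA. As a hedged sketch, a quick check like the one below reports whether the relevant variable is visible to the process; `HIP_PATH` is an assumption based on the default AMD HIP SDK installer, so adjust the name for your own setup:

```python
import os

# Hypothetical sanity check: report whether the HIP-related environment
# variables a ZLUDA setup usually relies on are present. HIP_PATH is set
# by the AMD HIP SDK installer by default; adjust the names as needed.
def check_env(names=("HIP_PATH",), env=None):
    env = os.environ if env is None else env
    return {name: env.get(name) for name in names}

for name, value in check_env().items():
    print(name, "->", value if value else "NOT set")
```

Note that variables set after a terminal was opened are not visible in that terminal; open a new one (or reboot) before re-running the .bat file.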

F0xiiNat0r commented 2 weeks ago

> I know people have got it working with a 7800XT, so it should work fine. But yes, I would just delete that entire folder and use the repo from Patient X.
>
> Did you set the environment variables in Windows?

Yes I did, but I just never got it to work, even with the alternative repo you linked. I'm also not 100% sure whether I need ROCm 6.1 or 5.7; probably 6.1 if I had to guess.

CS1o commented 1 week ago

> However, I found if I use this repo, I can update nodes, ComfyUI itself updates correctly, and I've successfully tested many UNets & checkpoints with it:
>
> https://github.com/patientx/ComfyUI-Zluda

Hey, thanks for letting me know. The patientx repo uses LeagueRaINi's patches, but without the "nodes-cudnn-patch", which prevents any custom node from using cuDNN because ZLUDA doesn't support it. But patientx made it easy to set up.

I will add a link to that repo in my ComfyUI Zluda guide and still leave my manual guide there too, for people who want to use the ROCm HIP SDK 6.1 with ComfyUI.