Closed radna0 closed 2 months ago
Just out of curiosity, why would you run ZLUDA on Linux?
Running on Linux would allow me to spin up an instance using a script like the one above and also control it via SSH. I'm trying to build out GPU clusters for deep learning, so I believe going with Linux is the wisest choice here.
They provide PyTorch packages built with ROCm. You can reach the same goals with them. ZLUDA is meaningful when you are trying to run CUDA-only software.
ROCm is not as widely adopted as CUDA; many libraries do not officially support ROCm the way they support CUDA. Flash Attention might work, for example, but not Flash Attention 2.
> They provide PyTorch packages built with ROCm. You can reach the same goals with them. ZLUDA is meaningful when you are trying to run CUDA-only software.
This is only true for Linux. PyTorch does not currently work on Windows with ROCm. ZLUDA is particularly useful for Windows users who have AMD hardware: HIP/ROCm is only available up to 5.7.1 on Windows (6.1 on Linux), and PyTorch only started supporting ROCm 6.x.
ComfyUI-Zluda https://github.com/patientx/ComfyUI-Zluda makes use of an older ZLUDA, but I am currently using your fork, which generally works quite well.
Is ZLUDA 3.8 compatible with CUDA 12.4 or 12.1? I am running CUDA 11.8 PyTorch with ComfyUI via ZLUDA on a 7900 XT.
I made it work by building PyTorch myself. The official CUDA release of PyTorch won't work.
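Since the distinction between a ROCm build and the official CUDA build of PyTorch comes up repeatedly here, a quick way to see which backend a given PyTorch install actually targets is to inspect `torch.version.hip` and `torch.version.cuda` (a ROCm build sets the former, a CUDA build the latter). The `classify_backend` helper below is hypothetical, written just for this check:

```python
# Sketch: report which GPU backend the installed PyTorch build targets.
# torch.version.hip is a version string on ROCm builds and None on CUDA
# builds; torch.version.cuda is the reverse.

def classify_backend(hip_version, cuda_version):
    """Return 'rocm', 'cuda', or 'cpu' given torch.version fields."""
    if hip_version is not None:
        return "rocm"
    if cuda_version is not None:
        return "cuda"
    return "cpu"

try:
    import torch
    backend = classify_backend(torch.version.hip, torch.version.cuda)
    print(f"torch {torch.__version__} targets: {backend}")
except ImportError:
    print("PyTorch is not installed in this environment")
```

A self-built ROCm PyTorch should report `rocm` here, while the official CUDA wheels (the ones that won't work without ZLUDA on AMD hardware) report `cuda`.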
I'm trying to set up PyTorch and ZLUDA to run CUDA on AMD GPUs, but to no avail.
This is the setup script I use for setting up the OS and building PyTorch:
OS
ZLUDA and PyTorch
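For illustration, a setup in this spirit might look like the sketch below. This is not the author's actual script: the distribution, package names, and build steps are assumptions (Ubuntu with AMD's ROCm apt repository already configured), though `tools/amd_build/build_amd.py` and `USE_ROCM=1` are the standard route for building PyTorch against ROCm:

```shell
#!/usr/bin/env bash
# Illustrative sketch only -- assumes Ubuntu with AMD's ROCm repository
# already added; package names are examples, not a verified list.
set -euo pipefail

# --- OS: build dependencies and the ROCm/HIP SDK ---
sudo apt-get update
sudo apt-get install -y git cmake ninja-build python3-pip rocm-hip-sdk

# --- ZLUDA and PyTorch ---
git clone https://github.com/vosen/ZLUDA.git
# (build ZLUDA per its README)

git clone --recursive https://github.com/pytorch/pytorch.git
cd pytorch
pip3 install -r requirements.txt
# Hipify the CUDA sources, then build against ROCm:
python3 tools/amd_build/build_amd.py
USE_ROCM=1 python3 setup.py install
```

Building PyTorch from source this way takes a long time; the point of the sketch is just the overall shape (OS packages, then ZLUDA, then a ROCm-targeted PyTorch build).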