vosen / ZLUDA

CUDA on ??? GPUs
Apache License 2.0

Ubuntu 22.04, AMD GPU, PyTorch build, NCCL #130

Open liuyang6055 opened 7 months ago

liuyang6055 commented 7 months ago

hello,

```
export TORCH_CUDA_ARCH_LIST="6.1+PTX"
export CUDAARCHS=61
export CMAKE_CUDA_ARCHITECTURES=61
export USE_SYSTEM_NCCL=1
export NCCL_ROOT_DIR=/usr
export NCCL_INCLUDE_DIR=/usr/include
export NCCL_LIB_DIR=/usr/lib/x86_64-linux-gnu
export USE_EXPERIMENTAL_CUDNN_V8_API=OFF
```

I would like to ask a PyTorch question. I am using Ubuntu 22.04, an AMD GPU, and ZLUDA, and I found that the PyTorch build did not use zluda/target/release at all. So how does ZLUDA work here? None of these exports reference ZLUDA.

The model I am using is fairseq.
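For context, ZLUDA on Linux is normally injected at run time rather than at build time: you point the dynamic loader at the directory containing ZLUDA's libcuda.so so that the stock CUDA build of PyTorch loads it instead of the NVIDIA driver. A minimal sketch, assuming the ZLUDA build output really is in zluda/target/release as mentioned above (the exact paths and the train.py entry point are illustrative, not taken from this thread):

```
# Hypothetical location of the ZLUDA build output; adjust to your checkout.
ZLUDA_DIR="$HOME/zluda/target/release"

# Put ZLUDA's libcuda.so ahead of the system one, then run the workload unchanged.
LD_LIBRARY_PATH="$ZLUDA_DIR:$LD_LIBRARY_PATH" python train.py
```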

Simmer1234569 commented 7 months ago

If you're using Ubuntu, just download the ROCm version of PyTorch.

ricperry commented 7 months ago

Why the ROCm version? Is it faster than ZLUDA? There have been numerous compatibility issues with pytorch-rocm, so if we could use the standard CUDA build instead, we might be able to avoid those compatibility issues.

liuyang6055 commented 7 months ago

We only have AMD hardware, but we would like to use CUDA and NCCL, so we want to know whether ZLUDA can achieve this. We don't want to use ROCm.

Simmer1234569 commented 7 months ago

PyTorch-rocm is the same as PyTorch-cuda on the surface. The ZLUDA integration of PyTorch doesn't work with everything, because cuDNN doesn't work with PyTorch.


vosen commented 7 months ago

ZLUDA uses ROCm, so it's not like you can avoid ROCm entirely with ZLUDA. Back to the original question: PyTorch uses ZLUDA just like every other application does. ZLUDA poses as a (special) CUDA implementation, and that makes no difference to the application.
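If ZLUDA is picked up correctly, the stock CUDA wheel of PyTorch should simply see a CUDA device backed by the AMD GPU. A quick, illustrative check (the zluda/target/release path is an assumption carried over from the question above):

```
# Run the unmodified CUDA build of PyTorch with ZLUDA on the library path.
LD_LIBRARY_PATH="$HOME/zluda/target/release:$LD_LIBRARY_PATH" \
  python -c "import torch; print(torch.cuda.is_available(), torch.cuda.get_device_name(0))"
# Expected: True, plus the AMD GPU's name reported through the CUDA API.
```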

thenightterorx commented 6 months ago

> ZLUDA uses ROCm, so it's not like you can avoid ROCm entirely with ZLUDA. Back to the original question: PyTorch uses ZLUDA just like every other application does. ZLUDA poses as a (special) CUDA implementation, and that makes no difference to the application.

Would ZLUDA allow the use of something like xformers to speed up some AI programs?