Open sorasoras opened 4 months ago
Sorry I skipped over this, apparently? My guess would be you have the ROCm version of PyTorch installed. Solution would be a venv with the CUDA version of PyTorch, probably.
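Before setting up the venv, it may help to confirm which backend the currently installed PyTorch wheel was built for. A minimal diagnostic sketch (the helper name `torch_backend` is hypothetical; it relies on the real `torch.version.cuda` / `torch.version.hip` attributes):

```python
import importlib.util

def torch_backend():
    """Report which GPU backend the installed PyTorch was built for.

    Returns 'cuda', 'rocm', or 'none' (torch missing or CPU-only build)."""
    if importlib.util.find_spec("torch") is None:
        return "none"
    import torch
    # ROCm wheels set torch.version.hip to a version string; CUDA wheels leave it None.
    if getattr(torch.version, "hip", None):
        return "rocm"
    if torch.version.cuda:
        return "cuda"
    return "none"

print(torch_backend())
```

If this prints `rocm`, installing the CUDA wheel into a fresh venv (per the suggestion above) should resolve the mismatch.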
I have ROCm for Windows installed, but not PyTorch for Windows ROCm, since that doesn't exist yet.
I think the problem is that the build keeps finding the Windows ROCm installation and skips the CUDA build of PyTorch. How do I block exllama from finding ROCm and just use CUDA?
What should I do to force exllamav2 to use CUDA instead of detecting ROCm?