Open slaeves2 opened 11 months ago
Hi @slaeves2,
For Radeon GPUs you can use ROCm with PyTorch; very good documentation on the installation can be found here: Installing PyTorch for ROCm. It works very well on my Linux system (I don't know whether it works on Windows).
Cheers Martin
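To sanity-check a ROCm install like the one described above, a small script can be used. This is a sketch, not from the thread; it relies on the fact that the ROCm build of PyTorch exposes the Radeon GPU through the regular `torch.cuda` API, so `torch.cuda.is_available()` should return True on a supported card:

```python
# Sanity check for a ROCm PyTorch install (sketch, not part of the thread).
# On ROCm builds, Radeon GPUs are reported through the torch.cuda API.
try:
    import torch
    print("PyTorch version:", torch.__version__)
    print("GPU available:", torch.cuda.is_available())
    if torch.cuda.is_available():
        # On a ROCm system this reports the Radeon device name.
        print("Device name:", torch.cuda.get_device_name(0))
except ImportError:
    print("PyTorch is not installed")
```

If `GPU available: False` is printed on a ROCm system, the installed wheel is likely a CPU-only or CUDA build rather than a ROCm one.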
The following offline models have been tested with a Radeon GPU:

mbart50 -> translations for the rubbish bin
m2m100 -> ctranslate2 doesn't work with Radeon
m2m100_big -> ctranslate2 doesn't work with Radeon
nllb -> works
nllb_big -> works
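A plausible explanation for the m2m100 failures above is that CTranslate2 only targets NVIDIA CUDA for GPU execution, so it sees no usable device on a Radeon card. A minimal check (a sketch using CTranslate2's Python API; `get_cuda_device_count` is the documented call):

```python
# Sketch: check whether CTranslate2 can see any GPU at all.
# On a Radeon/ROCm system this is expected to report 0 devices,
# which would explain the m2m100 failures listed above.
try:
    import ctranslate2
    print("CTranslate2 CUDA device count:", ctranslate2.get_cuda_device_count())
except ImportError:
    print("CTranslate2 is not installed")
```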
What would your feature do?
As I understand it, there is currently no way to use my GPU if it is not an NVIDIA card; I have a Radeon card with the Vulkan driver. As far as I know, a Vulkan-supporting version of waifu was created for Radeon users. Is there any guide or method to avoid using CUDA and use Radeon video cards with this program? I apologise if I used the wrong thread for this, but I didn't find any answers to my question.