RuolinZheng08 / twewy-discord-chatbot

Discord AI Chatbot using DialoGPT, trained on the game transcript of The World Ends With You
https://www.freecodecamp.org/news/discord-ai-chatbot/
MIT License
317 stars 156 forks source link

How to use amd gpu? #5

Open crackedpotato007 opened 3 years ago

crackedpotato007 commented 3 years ago

Hey, I own an AMD GPU and would like to train models, but I can't since the tooling needs Nvidia drivers. Are there any workarounds?

capps1994 commented 3 years ago

Most ML libraries/frameworks use CUDA, which is an Nvidia technology, so AMD GPUs are generally unsupported to my knowledge. I'm sure there could be a workaround somewhere, but I'm not aware of one. The standard for machine learning right now is Nvidia. Sorry to disappoint; however, Google Colaboratory lets you run the code with GPU acceleration.

Hurri08 commented 1 year ago

I know this is pretty old, but I figured I'd still answer in case someone finds their way here. To train AI models with AMD GPUs you first need ROCm: https://docs.amd.com/bundle/ROCm-Installation-Guide-v5.5/page/Introduction_to_ROCm_Installation_Guide_for_Linux.html As the URL suggests, you need a supported Linux distribution and a PyTorch build that matches your ROCm version. ROCm supports the CUDA nomenclature, so no code changes are needed; PyTorch/ROCm just "translates" it. Once all of this is installed properly, you can start training on your AMD GPU.
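To illustrate the "no code changes" point, here is a minimal sketch (assuming a ROCm build of PyTorch is installed): the same `torch.cuda` API is used for both vendors, and `torch.cuda.is_available()` reports True on a supported AMD GPU under ROCm just as it does on an Nvidia GPU under CUDA.

```python
# Sketch: ROCm builds of PyTorch expose the CUDA device API,
# so the usual device-selection idiom works unchanged on AMD GPUs.
import torch

def pick_device() -> torch.device:
    # True on Nvidia/CUDA and on AMD/ROCm builds alike; falls back to CPU.
    return torch.device("cuda" if torch.cuda.is_available() else "cpu")

device = pick_device()
model = torch.nn.Linear(4, 2).to(device)      # moves to the GPU if one is present
x = torch.randn(3, 4, device=device)
out = model(x)                                # runs on whichever device was picked
print(device.type, tuple(out.shape))
```

The training scripts in this repo use the same `"cuda" if torch.cuda.is_available() else "cpu"` pattern, which is why they should run on ROCm without edits.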

With version 5.6, Windows support is supposedly coming.