Open weichen-1988 opened 1 year ago
Yes, Mini GPT-4 can be used with AMD GPUs. PyTorch, the deep learning framework Mini GPT-4 is built on, supports AMD GPUs through the ROCm (Radeon Open Compute) platform.
To use Mini GPT-4 with AMD GPUs, you need to ensure that you have the necessary software and drivers installed. Here are the general steps to set up Mini GPT-4 with AMD GPU:
Install ROCm: Visit the ROCm website (https://rocmdocs.amd.com/en/latest/Installation_Guide/Installation-Guide.html) and follow the installation instructions specific to your operating system. This will install the ROCm platform and necessary drivers for AMD GPUs.
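Once ROCm is installed, it is worth sanity-checking the driver before touching PyTorch. The commands below are a sketch assuming a default install under /opt/rocm; rocminfo and rocm-smi are ROCm's own diagnostic tools.

```shell
# List detected GPU agents; an AMD GPU should show up with a gfx* name.
/opt/rocm/bin/rocminfo | grep -i "gfx"

# Report driver, temperature, and VRAM status for each GPU.
/opt/rocm/bin/rocm-smi
```

If neither command sees your card, fix the driver install before moving on to PyTorch.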
Install PyTorch with ROCm support: You can install PyTorch with ROCm support after installing ROCm. For system-specific installation instructions, consult the official PyTorch website (https://pytorch.org) or the ROCm PyTorch repository (https://github.com/ROCmSoftwarePlatform/pytorch).
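As a concrete illustration, installing a ROCm build of PyTorch usually means pointing pip at a ROCm wheel index. The exact index URL and ROCm version below are assumptions; use the install selector on https://pytorch.org to get the command for your system.

```shell
# Hypothetical example: substitute the rocm version that matches your
# installed ROCm platform (rocm5.7 here is only a placeholder).
pip3 install torch torchvision --index-url https://download.pytorch.org/whl/rocm5.7
```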
Verify GPU availability: after installation, run the following snippet in a Python environment with the ROCm build of PyTorch to check whether your AMD GPU is recognised and ready for training.
import torch

# ROCm builds of PyTorch expose AMD GPUs through the torch.cuda API,
# so "cuda" here also covers AMD devices.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print(device)  # "cuda" if the AMD GPU was detected, otherwise "cpu"
Run Mini GPT-4: with the AMD GPU detected and available, you can run the Mini GPT-4 code as you would on any other GPU. Make sure to move the model (and its inputs) to the device:
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
# Replace MiniGPT4() with the actual Mini GPT-4 model class you are using.
model = MiniGPT4().to(device)
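To make the device-placement step runnable end to end, here is a minimal sketch using a tiny stand-in module (TinyModel is hypothetical, not part of the Mini GPT-4 codebase); the pattern of moving both the model and its input tensors to the same device is what matters.

```python
import torch
import torch.nn as nn

# Stand-in for the real MiniGPT4 class; replace with the model class
# from the Mini GPT-4 repository in actual use.
class TinyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(8, 2)

    def forward(self, x):
        return self.linear(x)

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = TinyModel().to(device)

# Inputs must live on the same device as the model's parameters,
# otherwise PyTorch raises a device-mismatch error.
x = torch.randn(4, 8, device=device)
out = model(x)
print(out.shape)
```

The same .to(device) pattern works unchanged whether the backend is CUDA or ROCm.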
Keep in mind that not every deep learning model performs identically, or works out of the box, on AMD GPUs. Make sure the necessary optimisations and compatibility checks have been carried out for the particular model and library versions you are using.
For more thorough instructions and any other factors unique to your system, please consult the official PyTorch and ROCm documentation.
Yes, an AMD GPU can run it. If 8-bit mode gives an error, try 16-bit instead. 16-bit needs more VRAM (about 16 GB for the 7B model). I made some videos on using an AMD GPU to run LLMs: https://youtu.be/UtcaO3zTCKQ
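The VRAM difference between modes comes down to bytes per parameter: fp16 stores each weight in 2 bytes versus 1 byte for int8, which is roughly why the 7B model needs about 16 GB in 16-bit. A small sketch of casting a model to half precision and measuring the saving (using a plain Linear layer as a stand-in):

```python
import torch
import torch.nn as nn

# Stand-in module; a real LLM would be loaded from its checkpoint.
model = nn.Linear(1024, 1024)
fp32_bytes = sum(p.numel() * p.element_size() for p in model.parameters())

# Cast parameters to fp16 (16-bit), halving their memory footprint.
model = model.half()
fp16_bytes = sum(p.numel() * p.element_size() for p in model.parameters())

print(fp32_bytes, fp16_bytes)
```

For a 7B-parameter model, 7e9 parameters x 2 bytes is ~14 GB of weights alone, before activations and overhead, which matches the ~16 GB figure above.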