-
### What would you like to be added?
Support ROCm PyTorch distributed training runtime.
### Why is this needed?
PyTorch has been advertising support for AMD ROCm and AMD Instinct and Radeon GPUs si…
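For context, a ROCm build of PyTorch exposes AMD GPUs through the familiar `torch.cuda.*` namespace and reports its HIP version via `torch.version.hip` (which is `None` on CUDA and CPU-only builds). A minimal sketch of detecting which backend a given PyTorch install was built against — the helper name `rocm_info` is illustrative, not part of any API:

```python
# Minimal sketch: distinguish ROCm/HIP, CUDA, and CPU-only PyTorch builds.
# On a ROCm build, torch.version.hip is a version string; on a CUDA build,
# torch.version.cuda is set instead; on a CPU-only build, both are None.
def rocm_info(torch_module):
    """Return a short description of the backend a PyTorch build targets."""
    hip = getattr(torch_module.version, "hip", None)
    if hip is not None:
        return f"ROCm/HIP build: {hip}"
    cuda = getattr(torch_module.version, "cuda", None)
    if cuda is not None:
        return f"CUDA build: {cuda}"
    return "CPU-only build"

try:
    import torch
    print(rocm_info(torch))
except ImportError:
    # PyTorch not installed in this environment; the helper above still
    # works against any object exposing a .version attribute.
    print("PyTorch not installed")
```

Note that even on ROCm builds, device selection still goes through `torch.cuda.is_available()` and `device="cuda"`, since HIP reuses the CUDA device namespace.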
-
Hi, I was wondering if the FlexAttention API is supported on AMD GPUs
-
### 🐛 Describe the bug
I'm trying to test this library on an HPC cluster with AMD MI250X GPUs, but I hit a strange, seemingly Triton-related error specifically when I call `model.train()`. Th…
-
```[tasklist]
### Tasks
- [x] HIP Codegen Support
```
-
### Your current environment
I am trying out FP8 support on AMD GPUs (MI250, MI300), but the vLLM library does not yet seem to support FP8 quantization on AMD GPUs. Is there any timeline for when thi…
-
**Describe the bug**
I have an AMD Radeon RX 6800 XT, and Stable Diffusion supports this GPU.
After building this image, it fails to run:
```
=> => naming to docker.io/library/webui-docker-autom…
-
### Your current environment
Hi,
I am attempting to run the [vLLM ROCm image](https://hub.docker.com/r/rocm/vllm-ci/tags) on a Kubernetes cluster. The AMD GPU is successfully detected, and the A…
-
Hello! Having studied the provided documentation, I still could not tell whether GGUF-quantized models are supported on AMD GPUs. I would like to use the Q8 or even Q4 model based on Mistr…
-
### Describe the bug
The status plugin for GPU usage doesn't handle AMD or Intel GPUs
### To Reproduce
Steps to reproduce the behavior:
1. Have something other than an Nvidia GPU
2. Add `gpu-…