satyamk7054 opened this issue 7 months ago
Hi @satyamk7054, we are working on releasing an official vLLM Docker image for ROCm. Please stay tuned and use our Dockerfile (Dockerfile.rocm) to build your container for now.
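For reference, building from that Dockerfile usually looks something like the sketch below; the image tag `vllm-rocm` is just an example name, and build arguments may differ for your GPU and ROCm version.

```bash
# Build a vLLM image for ROCm from the repo root using Dockerfile.rocm.
# The tag "vllm-rocm" is an arbitrary example; pick any name you like.
DOCKER_BUILDKIT=1 docker build -f Dockerfile.rocm -t vllm-rocm .
```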
Hi @WoosukKwon , thank you for your response.
Are there any plans to provide a pre-compiled artifact as well?
+1
This issue has been automatically marked as stale because it has not had any activity within 90 days. It will be automatically closed if no further activity occurs within 30 days. Leave a comment if you feel this issue should remain open. Thank you!
Hi, are there any plans to provide vLLM releases that are pre-compiled for AMD GPUs?
How are you installing vLLM? The AMD installation guide is here:
https://docs.vllm.ai/en/latest/getting_started/amd-installation.html
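As a rough sketch, after building an image from Dockerfile.rocm, launching it typically requires the ROCm device flags shown below; adjust them to your setup and treat the guide above as authoritative.

```bash
# Launch the locally built image with GPU access on a ROCm host.
# /dev/kfd and /dev/dri expose the AMD GPU devices to the container;
# "vllm-rocm" is the example tag used when building the image.
docker run -it --network=host \
    --device=/dev/kfd --device=/dev/dri \
    --group-add video --ipc=host \
    vllm-rocm
```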