vllm-project / vllm

A high-throughput and memory-efficient inference and serving engine for LLMs
https://docs.vllm.ai
Apache License 2.0

[Installation]: Who can help me install vLLM (at least 0.4.2) on an MI50 8-GPU machine? #6539

Open linchen111 opened 3 months ago

linchen111 commented 3 months ago

Your current environment

Who can help me install vLLM (at least 0.4.2) on an MI50 8-GPU machine? I'll pay for your time.

How you are installing vllm

No response
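For reference, once a ROCm-enabled build of vLLM is in place on the machine, the eight MI50s can be exercised through vLLM's Python API with tensor parallelism. The sketch below is a minimal smoke test, not an installation recipe: it assumes a working ROCm build of vLLM is already installed (MI50 / gfx906 sits outside vLLM's documented ROCm support matrix, so a source build may be needed), and the model name is a placeholder.

```python
# Minimal 8-GPU tensor-parallel smoke test, assuming a ROCm-enabled vLLM build
# is already installed and all eight MI50s are visible to the runtime.
from vllm import LLM, SamplingParams

llm = LLM(
    model="facebook/opt-125m",   # placeholder; substitute the intended model
    tensor_parallel_size=8,      # shard the model across all 8 GPUs
)

sampling = SamplingParams(temperature=0.8, max_tokens=64)
outputs = llm.generate(["Hello, my name is"], sampling)
for out in outputs:
    print(out.outputs[0].text)
```

If this runs end to end and prints a completion, the multi-GPU setup itself is healthy and any remaining problems are in the build/install step rather than the runtime configuration.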

github-actions[bot] commented 1 week ago

This issue has been automatically marked as stale because it has not had any activity within 90 days. It will be automatically closed if no further activity occurs within 30 days. Leave a comment if you feel this issue should remain open. Thank you!