EmbeddedLLM/vllm-rocm
vLLM: A high-throughput and memory-efficient inference and serving engine for LLMs
Docs: https://vllm.readthedocs.io | License: Apache License 2.0
Update vLLM Documentation #18
Status: Closed (tjtanaa closed this issue 6 months ago)

tjtanaa commented 6 months ago:
Update the vLLM installation procedure for the AMD (ROCm) platform.
Update the rest of the vLLM documentation accordingly.
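
For context, an AMD installation guide of this kind typically ends with a quick sanity check. A minimal sketch of such a check, assuming a ROCm build of PyTorch and an installed vLLM (the model name here is illustrative, not mandated by the docs):

```python
# Sanity check after installing vLLM on an AMD/ROCm machine.
# Assumes a ROCm build of PyTorch and vLLM are already installed.
import torch
from vllm import LLM, SamplingParams

# On a ROCm build of PyTorch, torch.version.hip is a version string;
# it is None on CUDA-only builds.
print("HIP runtime:", torch.version.hip)

# Load a small model and run a single generation to confirm the
# engine works end to end on the AMD GPU.
llm = LLM(model="facebook/opt-125m")
outputs = llm.generate(["Hello, my name is"], SamplingParams(max_tokens=16))
print(outputs[0].outputs[0].text)
```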