OpenCSGs/llm-inference
llm-inference is a platform for publishing and managing LLM inference, providing a wide range of out-of-the-box features for model deployment, such as a UI, a RESTful API, auto-scaling, computing-resource management, monitoring, and more.
Apache License 2.0
upgrade vllm to v0.4.1
#143
Closed by depenglee1707 5 months ago
depenglee1707 commented 5 months ago:
Suffered for more than one week -_-
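For readers hitting the same upgrade, a minimal sketch of pinning the version the issue targets, assuming a pip-based install (the actual dependency pins used by llm-inference may differ; check the project's requirements file for compatible ray/torch versions before upgrading):

```shell
# Hypothetical upgrade command; vllm==0.4.1 is the version named in the issue title.
# Pin the exact version rather than using --upgrade alone, so transitive
# dependencies (torch, CUDA wheels) resolve against a known release.
pip install "vllm==0.4.1"

# Verify the installed version afterwards.
python -c "import vllm; print(vllm.__version__)"
```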