OpenCSGs / llm-inference

llm-inference is a platform for publishing and managing LLM inference, providing a wide range of out-of-the-box features for model deployment, such as a UI, a RESTful API, auto-scaling, computing resource management, monitoring, and more.
Apache License 2.0

update quickstart.md to remove evaluate #101

Closed: wanggxa closed this pull request 6 months ago

wanggxa commented 6 months ago

Fixes: https://github.com/OpenCSGs/llm-inference/issues/100

jasonhe258 commented 6 months ago

Good catch! Thanks for your contribution! Looks good to me!