bentoml / OpenLLM

Run any open-source LLM, such as Llama 3.1 or Gemma, as an OpenAI-compatible API endpoint in the cloud.
https://bentoml.com
Apache License 2.0

Availability of the OpenAI /v1/completions API Endpoint? #1015

Closed vjeantet closed 1 week ago

vjeantet commented 1 month ago

Hello,

I noticed in the code examples that the OpenAI completions endpoint is implemented. Could you let me know when it will be available via the API?

As of today, the Swagger documentation shows "v1/chat/completions" but not "v1/completions," and attempts to access the latter result in a 404 error page.
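For reference, a minimal sketch of the failing request (assuming a local OpenLLM server on the default port 3000 and a placeholder model id; adjust both for your deployment):

```python
# Reproduction sketch: /v1/chat/completions works, /v1/completions returns 404.
# Assumptions: server at http://localhost:3000, placeholder model id.
import requests

resp = requests.post(
    "http://localhost:3000/v1/completions",
    json={
        "model": "meta-llama/Meta-Llama-3.1-8B-Instruct",  # placeholder
        "prompt": "Hello",
        "max_tokens": 16,
    },
)
print(resp.status_code)  # currently 404; the chat endpoint responds normally
```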

Thank you for your efforts, and keep up the good work!

Valere

aarnphm commented 1 month ago

Yeah, I can port the implementation over.

Lexazan commented 1 month ago

+1 for v1/completions support

bojiang commented 1 week ago

Supported as of OpenLLM v0.6.
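For anyone landing here later, a minimal sketch of calling the endpoint with the OpenAI Python client (assuming OpenLLM >= 0.6 serving locally on the default port 3000 and a placeholder model id; replace both with your own values):

```python
# Sketch only: base URL, API key, and model id are placeholders for a local deployment.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:3000/v1", api_key="na")

completion = client.completions.create(
    model="meta-llama/Meta-Llama-3.1-8B-Instruct",  # placeholder model id
    prompt="Once upon a time",
    max_tokens=32,
)
print(completion.choices[0].text)
```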