Closed yourbuddyconner closed 1 month ago
I have some "legacy" code that still uses the `openai.completion.create` endpoint, which makes a POST request to `/v1/completions`. llamafile supports a `/completion` endpoint that I assume is compatible, but it is out of spec with the openai SDK by default.

Probably an easy fix!
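In the meantime, one workaround is to bypass the SDK and POST to `/completion` directly. The sketch below maps OpenAI-style completion arguments onto llama.cpp's `/completion` request fields (`prompt`, `n_predict`, `temperature`); the server URL and default port are assumptions, so adjust them for your setup:

```python
import json
import urllib.request

# Assumed llamafile server address; 8080 is the usual default port.
LLAMAFILE_URL = "http://localhost:8080/completion"

def to_llamacpp_payload(prompt, max_tokens=16, temperature=1.0):
    """Translate OpenAI-style completion arguments into the
    llama.cpp /completion request fields."""
    return {
        "prompt": prompt,
        "n_predict": max_tokens,   # llama.cpp's name for max_tokens
        "temperature": temperature,
    }

def complete(prompt, **kwargs):
    """POST directly to /completion instead of going through the
    openai SDK, which would target /v1/completions."""
    data = json.dumps(to_llamacpp_payload(prompt, **kwargs)).encode()
    req = urllib.request.Request(
        LLAMAFILE_URL,
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["content"]
```

This keeps the legacy call sites mostly intact: swap `openai.completion.create(prompt=...)` for `complete(prompt, ...)` and translate any extra parameters in `to_llamacpp_payload`.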
If anyone runs across this -- it's because llamafile depends on llama.cpp, which only exposes a chat completions endpoint.