abetlen / llama-cpp-python

Python bindings for llama.cpp
https://llama-cpp-python.readthedocs.io
MIT License

server: chat completions returns wrong logprobs model #1787

domdomegg opened this issue 1 month ago

domdomegg commented 1 month ago

Prerequisites

Please answer the following questions for yourself before submitting an issue.

Expected Behavior

The OpenAI-compatible server should match the response structure of the OpenAI API for chat completions with logprobs.

This means returning a logprobs object with 'content' and 'refusal' keys, each containing per-token logprob information.

Reference: https://platform.openai.com/docs/api-reference/chat/object
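For illustration, the chat completions logprobs object described in the reference above has roughly this shape (a sketch with made-up token strings and values):

```python
# Shape of choices[0].logprobs in the OpenAI chat completions API.
# Values below are made up for illustration.
{
    "content": [
        {
            "token": "Hello",
            "logprob": -0.31,
            "bytes": [72, 101, 108, 108, 111],
            "top_logprobs": [
                {"token": "Hello", "logprob": -0.31, "bytes": [72, 101, 108, 108, 111]},
                {"token": "Hi", "logprob": -1.42, "bytes": [72, 105]},
            ],
        },
    ],
    "refusal": None,
}
```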

Current Behavior

When calling the chat completions API, the server returns the logprobs format of the legacy completions API rather than that of the chat completions API.
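For comparison, a sketch of the legacy completions logprobs shape the server currently returns (values again made up):

```python
# Shape of the legacy completions API logprobs object,
# which the server currently returns for chat completions.
# Values below are made up for illustration.
{
    "tokens": ["Hello"],
    "text_offset": [0],
    "token_logprobs": [-0.31],
    "top_logprobs": [{"Hello": -0.31, "Hi": -1.42}],
}
```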

Environment and Context

Please provide detailed information about your computer setup. This is important in case the issue is not reproducible except for under certain specific conditions.

$ lscpu

$ uname -a

$ python3 --version
$ make --version
$ g++ --version
domdomegg commented 1 month ago

I've put up a fix here: https://github.com/abetlen/llama-cpp-python/pull/1788
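For anyone who wants to reproduce this, here is a minimal sketch using the openai Python client against a locally running server (the base URL, API key, and model name are assumptions about the local setup):

```python
from openai import OpenAI

# Point the client at a local llama-cpp-python server.
# base_url, api_key, and model are assumptions about the local setup.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="sk-local")

response = client.chat.completions.create(
    model="local-model",
    messages=[{"role": "user", "content": "Say hello"}],
    logprobs=True,
    top_logprobs=2,
)

# Expected: an object with 'content' (and 'refusal') token entries;
# currently the server returns the legacy completions structure instead.
print(response.choices[0].logprobs)
```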