Closed mw19930312 closed 1 month ago
Hey @mw19930312, great to see you back! 😊 I'm on it - will get back to you with an answer shortly!
@mw19930312 it already is:

```python
# Imports assume a recent llama-index (v0.10+) layout.
from llama_index.llms.openai import OpenAI
from llama_index.core.llms import ChatMessage

llm = OpenAI(..., logprobs=True, top_logprobs=20)
response = await llm.achat([ChatMessage(role="user", content="Hello!")])
print(response.logprobs)
```

20 is the highest value OpenAI lets you select.
Yes, log probabilities can indeed be retrieved through `llm.achat` by setting `logprobs=True` and specifying `top_logprobs=20`, as shown in the example code provided. This configuration enables retrieval of log probabilities for the top 20 candidate tokens; after making a request, `response.logprobs` will contain them.
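As a side note, the values the API returns are natural-log probabilities, so they can be mapped back to ordinary probabilities with `exp()`. The sketch below uses hypothetical token/logprob values standing in for one entry of `response.logprobs`; the exact container type varies by llama-index and OpenAI SDK version, so treat it as illustrative only.

```python
import math

# Hypothetical top_logprobs entry for a single generated position;
# real responses expose token/logprob pairs in a version-specific object.
top_logprobs = {"Hello": -0.01, "Hi": -4.2, "Hey": -5.0}

# Log probabilities are natural logs; exp() recovers probabilities.
probs = {tok: math.exp(lp) for tok, lp in top_logprobs.items()}

# Print candidates from most to least likely.
for tok, p in sorted(probs.items(), key=lambda kv: -kv[1]):
    print(f"{tok!r}: {p:.4f}")
```

All returned probabilities are at most 1.0, since logprobs are always ≤ 0.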
Question Validation
Question
I wonder whether log probabilities (see https://platform.openai.com/docs/api-reference/chat#chat-create-logprobs) can be retrieved through llm.achat when using the OpenAI API. If so, is there example code?