run-llama / llama_index

LlamaIndex is a data framework for your LLM applications
https://docs.llamaindex.ai
MIT License

[Question]: Is log probability available in llm.achat? #13544

Closed mw19930312 closed 1 month ago

mw19930312 commented 4 months ago

Question

I wonder whether log probabilities can be retrieved through llm.achat (see https://platform.openai.com/docs/api-reference/chat#chat-create-logprobs for the logprobs option in the OpenAI API). If so, is there example code?

dosubot[bot] commented 4 months ago

Hey @mw19930312, great to see you back! 😊 I'm on it - will get back to you with an answer shortly!

logan-markewich commented 4 months ago

@mw19930312 it already is:

```python
from llama_index.llms.openai import OpenAI
from llama_index.core.llms import ChatMessage

llm = OpenAI(..., logprobs=True, top_logprobs=20)

response = await llm.achat([ChatMessage(role="user", content="Hello!")])
print(response.logprobs)
```

20 is the highest OpenAI lets you select.

dosubot[bot] commented 4 months ago

Yes, log probabilities can be retrieved through llm.achat by constructing the OpenAI LLM with logprobs=True and top_logprobs=20, as in the example code above. After a request completes, response.logprobs contains the log probabilities for the top 20 candidate tokens at each generated position.
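For anyone who wants to post-process the result: below is a minimal sketch (not from this thread) of how one might summarize such nested log-probability data. The helper name `summarize_logprobs` is hypothetical, and the toy data uses plain `(token, logprob)` tuples as a stand-in for the objects LlamaIndex actually returns, so adapt the attribute access to your version.

```python
import math

def summarize_logprobs(logprobs):
    """For each generated token position, return the most likely
    candidate token and its probability (exp of its log probability).

    `logprobs` is assumed to be a list (one entry per generated token)
    of lists of (token, logprob) candidate pairs.
    """
    summary = []
    for candidates in logprobs:
        token, lp = max(candidates, key=lambda c: c[1])
        summary.append((token, math.exp(lp)))
    return summary

# Toy data standing in for response.logprobs:
toy = [[("Hello", -0.1), ("Hi", -2.5)], [("!", -0.01), (".", -4.0)]]
print(summarize_logprobs(toy))
```

Since the API returns log probabilities, `math.exp` converts them back to probabilities in [0, 1] for easier reading.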
