Closed — zhangzhen-research closed this issue 11 months ago
Sorry for the late reply! This question is related to the other issue.
Completion models (such as text-davinci-003, which is the model used in the paper) return the probabilities of generated tokens if you specify the `logprobs` parameter in the request.
However, chat models (such as gpt-3.5-turbo) cannot return probabilities.
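As a minimal sketch of what this looks like: the legacy `openai.Completion.create` endpoint accepts a `logprobs` parameter, and the response then includes per-token log probabilities that you can exponentiate to get probabilities. The actual API call requires an API key (and text-davinci-003 has since been deprecated), so below it is shown as a comment, and the response parsing is demonstrated against a mocked response dict in the shape the API returns. The prompt text and helper name are illustrative, not from the paper.

```python
import math

# Sketch of the legacy Completion request (needs the `openai` package
# and an API key; shown as a comment so the example runs offline):
#
#   import openai
#   response = openai.Completion.create(
#       model="text-davinci-003",
#       prompt="The capital of France is",
#       max_tokens=5,
#       logprobs=5,  # also return the top-5 logprobs at each position
#   )

def token_probabilities(response: dict) -> list[tuple[str, float]]:
    """Pair each generated token with its probability (exp of its logprob)."""
    lp = response["choices"][0]["logprobs"]
    return [(tok, math.exp(logp))
            for tok, logp in zip(lp["tokens"], lp["token_logprobs"])]

# Mocked response in the shape returned when `logprobs` is set:
mock_response = {
    "choices": [{
        "text": " Paris",
        "logprobs": {
            "tokens": [" Paris"],
            "token_logprobs": [-0.01],
        },
    }]
}

print(token_probabilities(mock_response))
```

A logprob of -0.01 corresponds to a probability of about 0.99, i.e. the model was nearly certain of that token.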
How do you get token probabilities through OpenAI's API?