Closed · vkehfdl1 closed this 2 months ago
Part of #63.
The LlamaIndex OpenAI model does not expose logprobs or output tokens. We might need to support them ourselves for the various logprob-based modules.
See the g_eval method for how to get logprobs from an OpenAI API call.
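For reference, a minimal sketch of the g_eval-style approach: request per-token logprobs through the openai-python Chat Completions API and aggregate them into a score. The model name, prompt, and scoring function here are placeholders, not the project's actual implementation.

```python
import math

def mean_logprob(token_logprobs):
    """Average per-token log probability of a completion."""
    return sum(token_logprobs) / len(token_logprobs)

# Fetching logprobs with the openai-python library (needs an API key):
#
#   from openai import OpenAI
#   client = OpenAI()
#   resp = client.chat.completions.create(
#       model="gpt-3.5-turbo",                  # placeholder model name
#       messages=[{"role": "user", "content": "Hello"}],
#       logprobs=True,
#       top_logprobs=5,                         # alternatives per token, if needed
#   )
#   token_logprobs = [t.logprob for t in resp.choices[0].logprobs.content]
#
# With the logprobs in hand, one possible confidence score is the
# geometric-mean token probability:
sample = [-0.05, -0.42, -0.13]                  # stand-in values for a real response
score = math.exp(mean_logprob(sample))          # exp of the mean logprob
```

Note that `logprobs`/`top_logprobs` are parameters of the raw Chat Completions API, which is why a direct openai-python call is needed when the LlamaIndex wrapper does not pass them through.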
We don't have to fetch logprobs for now.
Third, there is no error from side effects: the only dependency is the openai-python library.