Closed uripeled2 closed 11 months ago
Add top_p (here are the OpenAI docs on top_p) as a param in BaseLLMAPIClient.text_completion. You can take a look at this great PR by @EyalPaz-700 and see how he added max_tokens and temperature to the BaseLLMAPIClient.text_completion params:
https://github.com/uripeled2/llm-client-sdk/pull/29