Closed pchalasani closed 8 months ago
I guess this is related to #1539
@pchalasani just pushed the fix - https://github.com/BerriAI/litellm/commit/37de964da47362729842f58baf228ae74314f7ab
Should be live soon in v1.20.6. Please re-open the issue if that doesn't fix your problem.
What happened?
Litellm 1.19.4
The `max_tokens` param is ignored. Looking in the code, it looks like the `get_optional_params` fn doesn't pick it up, even though `max_tokens` is passed. Examples below, with `ollama/llama2`, `ollama/mistral`, `ollama_chat/mistral`.

Relevant log output
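For illustration, here is a minimal sketch of the *class* of bug described above — an optional-params mapper that silently drops `max_tokens` when a provider's param map forgets to include it. This is **not** litellm's actual implementation; the function name `get_optional_params` is taken from the report, but the mapping tables and the `num_predict` translation (Ollama's name for the max-token option) are illustrative assumptions.

```python
# Hypothetical sketch of the reported bug class; NOT litellm's actual code.
# Provider param maps below are invented for illustration.

# provider -> {generic param name: provider-specific param name}
PROVIDER_PARAM_MAP = {
    # Buggy entry: "max_tokens" is missing from the map, so it is
    # silently dropped instead of being forwarded to the provider.
    "ollama": {"temperature": "temperature"},
    # Fixed entry: "max_tokens" is translated to Ollama's "num_predict".
    "ollama_fixed": {"temperature": "temperature", "max_tokens": "num_predict"},
}

def get_optional_params(provider: str, **kwargs) -> dict:
    """Translate generic params to provider-specific ones, dropping unknowns."""
    mapping = PROVIDER_PARAM_MAP.get(provider, {})
    return {mapping[k]: v for k, v in kwargs.items() if k in mapping}

# Buggy map: max_tokens vanishes, only temperature survives.
print(get_optional_params("ollama", max_tokens=10, temperature=0.2))
# Fixed map: max_tokens is forwarded as num_predict.
print(get_optional_params("ollama_fixed", max_tokens=10, temperature=0.2))
```

A fix of the kind linked in the comment above would amount to adding the missing `max_tokens` entry to the provider's mapping so the value is forwarded rather than filtered out.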
Twitter / LinkedIn details
No response