Closed: StanGirard closed this issue 9 months ago.
This is also happening when using a VertexAI model, in my case codechat-bison.
Getting this bug too, also running Mistral, while trying to run ChatLiteLLM through an LLMChain.
ValueError: LiteLLM.Exception: Unsupported parameters passed: n
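For reference, a minimal repro sketch of the failing setup (the imports match the langchain version of the time, and the model string is just an example): langchain's defaults include n=1, which ChatLiteLLM forwards to litellm and which then trips the validation for providers whose supported list lacks "n".

from langchain.chat_models import ChatLiteLLM
from langchain.chains import LLMChain
from langchain.prompts import PromptTemplate

# Example model string (assumption): a Mistral model served via Together AI
llm = ChatLiteLLM(model="together_ai/mistralai/Mistral-7B-Instruct-v0.1")
chain = LLMChain(llm=llm, prompt=PromptTemplate.from_template("Say hi to {name}"))
chain.run(name="world")  # raised: LiteLLM.Exception: Unsupported parameters passed: n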
Hi @StanGirard @HallerPatrick @minh-v
This is occurring because args are being passed in that aren't translated for the provider.
What should the expected behaviour here be?
Looks like we need to add support for the TGI 'n' param - doing so right now.
Fix pushed - 512769e84105a24fd7118ba18b71805ab94c28d3
Will update the ticket once the solution is in prod.
Also created a ticket to prevent future issues like this: https://github.com/BerriAI/litellm/issues/517
Please add any comments / ideas.
Fix pushed in v0.1.816.
Can you check and confirm whether this works for you @StanGirard / @HallerPatrick / @minh-v?
No, getting the same error:
in _check_valid_arg
raise ValueError("LiteLLM.Exception: Unsupported parameters passed: {}".format(', '.join(unsupported_params)))
ValueError: LiteLLM.Exception: Unsupported parameters passed: n
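For context, the check behind this traceback presumably looks something like the sketch below. This is reconstructed from the error message, not the exact litellm source; in litellm, _check_valid_arg is called with only supported_params and sees the passed params from its enclosing scope.

def _check_valid_arg(supported_params, passed_params):
    # Collect every passed param the provider doesn't support and report
    # them all in a single error (matching the traceback above).
    unsupported_params = [k for k in passed_params if k not in supported_params]
    if unsupported_params:
        raise ValueError(
            "LiteLLM.Exception: Unsupported parameters passed: {}".format(
                ", ".join(unsupported_params)
            )
        )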
The error for Mistral is fixed if you add those same changes to the together_ai section of the fix:
elif custom_llm_provider == "together_ai":
    ## check if an unsupported param was passed in
    supported_params = ["stream", "temperature", "max_tokens", "top_p", "stop", "frequency_penalty", "n"]
    _check_valid_arg(supported_params=supported_params)
    # forward each recognized param to the provider
    if stream:
        optional_params["stream_tokens"] = stream
    if temperature:
        optional_params["temperature"] = temperature
    if top_p:
        optional_params["top_p"] = top_p
    if max_tokens:
        optional_params["max_tokens"] = max_tokens
    if n:
        # Together AI's API has no "n" param, so map it onto "best_of"
        optional_params["best_of"] = n
@minh-v looks like Together AI doesn't support the n param: https://docs.together.ai/reference/inference
I know langchain sends n=1 by default. Are you setting this param for Together AI?
If not, I can introduce a default-value check.
Fix pushed - e834c063ff4a1632d9224976760a0cc35d7000b1
@minh-v will update once this is in prod.
I've pushed a patch that checks whether n=1 (the langchain default); if so, it won't raise an error.
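Roughly, the idea is as follows (a hypothetical helper illustrating the behavior, not the exact patch):

def drop_default_n(passed_params: dict) -> dict:
    # langchain always sends n=1, and n=1 is a no-op for providers
    # without an "n" param, so drop it rather than rejecting it as
    # an unsupported parameter.
    if passed_params.get("n") == 1:
        passed_params.pop("n")
    return passed_params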
Closing as this is now resolved.
@minh-v re-open if v0.1.819 didn't solve your problem.