@chymian i'm pretty sure your error is related to `max_tokens`
yup - can confirm this works without `max_tokens` in the `litellm_params`
i'll work on making the error message more descriptive
I can confirm as well. thx @krrishdholakia
@krrishdholakia
Q: should `max_tokens` not be dropped by `drop_params: true`, which is set?
What happened?
Using jina_ai via litellm throws a 400 Bad Request.
litellm deployment: version 1.52.0, docker image: ghcr.io/berriai/litellm-database (on Coolify)
config:
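(config not included in the report; as a rough sketch, a proxy config of the shape being discussed might look like the following — model name, token value, and env var name are assumptions, not the reporter's actual settings)

```yaml
model_list:
  - model_name: jina-embeddings
    litellm_params:
      model: jina_ai/jina-embeddings-v3      # illustrative jina_ai model
      api_key: os.environ/JINA_AI_API_KEY    # assumed env var name
      max_tokens: 1024                       # removing this line avoids the 400, per the comments above

litellm_settings:
  drop_params: true                          # set in the reporter's config, per the question above
```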
test cmd:
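(exact command not shown; assuming an embeddings request, something of this shape against the proxy's OpenAI-compatible endpoint — host, key, and model name are placeholders)

```bash
curl http://localhost:4000/v1/embeddings \
  -H "Authorization: Bearer sk-1234" \
  -H "Content-Type: application/json" \
  -d '{"model": "jina-embeddings", "input": ["hello world"]}'
```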
This throws errors (see below); using the same cmd pointing at https://api.jina.ai/v1 directly works flawlessly.

Relevant log output