Closed stronk7 closed 3 months ago
@stronk7 these look like prisma errors:

```
raise prisma_errors.MissingRequiredValueError(error)
prisma.errors.MissingRequiredValueError: Unable to match input value to any allowed input type for the field. Parse errors: [Invalid argument type. `data` should be of any of the following types: `LiteLLM_SpendLogsCreateManyInput`, `data.completion_tokens`: A value is required but not set]
```
I'll investigate this.
Hi @krrishdholakia,

For the record, I've tried here with openai `text-embedding-3-small` and haven't been able to reproduce the problem, so it seems to be ollama-related only. One small detail that caught my attention is that the openai embeddings request returns:
```json
"usage": {
    "prompt_tokens": 4,
    "completion_tokens": 0,
    "total_tokens": 4
}
```
but ollama returns `null` instead:
```json
"usage": {
    "prompt_tokens": 4,
    "completion_tokens": null,
    "total_tokens": 4
}
```
I don't know if that can be the cause of the prisma problem or not, but it seemed worth sharing.

Ciao :-)
Looks like this is caused by `completion_tokens` being `null` vs the expected `int` type.
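A minimal sketch of the kind of defensive fix this suggests: coerce a `null` `completion_tokens` (and any other token counter) to `0` before the spend-log row is handed to Prisma. The function name and shape here are illustrative assumptions, not LiteLLM's actual API; only the usage payloads mirror those shown above.

```python
def normalize_usage(usage: dict) -> dict:
    """Return a copy of a usage payload with null token counts coerced to 0.

    Hypothetical helper: ollama embedding responses report
    completion_tokens as null, while the spend-log schema
    (LiteLLM_SpendLogsCreateManyInput) requires an int.
    """
    normalized = dict(usage)
    for key in ("prompt_tokens", "completion_tokens", "total_tokens"):
        if normalized.get(key) is None:
            normalized[key] = 0
    return normalized


# The ollama payload from the comment above:
ollama_usage = {"prompt_tokens": 4, "completion_tokens": None, "total_tokens": 4}
print(normalize_usage(ollama_usage))
# {'prompt_tokens': 4, 'completion_tokens': 0, 'total_tokens': 4}
```

An alternative would be to fix the upstream ollama response mapping so `completion_tokens` is always emitted as `0` for embeddings, matching what openai returns.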
What happened?
After adding a couple of ollama embeddings to the configuration to start playing with them, errors immediately appear in the logs after the 1st request, recurring every few (10?) seconds.
I'm running a (just updated) `ghcr.io/berriai/litellm:main-latest` docker container. This is the definition of the 2 embeddings in `config.yaml` (they are using a local/host ollama instance; I imagine that's not important):
The endpoints seem to work ok:
And I get (correct):
Note that I've not tried embeddings from other providers, only ollama.
Relevant log output