Closed: neubig closed this issue 1 week ago
UserWarning: Pydantic serializer warnings:
This looks like a warning, not an error, @neubig
Ah, got it. Actually maybe it didn't error.
Could you print the response to confirm? I can close the issue if it ends up being a no-op.
I confirmed. But having no warning would be preferable I guess?
@ishaan-jaff do you have a fix for this in your pr? https://github.com/BerriAI/litellm/pull/5666
@krrishdholakia no - but tracking this issue on our O-1 master list: https://github.com/BerriAI/litellm/issues/5672
working through this issue as part of it
@ishaan-jaff I'm also getting this warning using "gpt-4o-mini"
same here
+1
Getting the same with litellm==1.46.1:
/path/to/.venv/lib/python3.12/site-packages/pydantic/main.py:390: UserWarning: Pydantic serializer warnings:
Expected `CompletionTokensDetails` but got `dict` with value `{'reasoning_tokens': 0}` - serialized value may not be as expected
return self.__pydantic_serializer__.to_python(
Looks like LiteLLM is not properly handling reasoning_tokens yet :/
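The mismatch in that warning can be reproduced with plain Pydantic v2, independent of LiteLLM. The model names below are simplified stand-ins for litellm's internal usage types, not its actual definitions — a sketch of the failure mode, assuming the response object was built without validation:

```python
import warnings
from typing import Optional

from pydantic import BaseModel


class CompletionTokensDetails(BaseModel):
    reasoning_tokens: int = 0


class Usage(BaseModel):
    completion_tokens_details: Optional[CompletionTokensDetails] = None


# model_construct() skips validation, so the raw dict is stored as-is
# instead of being coerced into a CompletionTokensDetails instance
usage = Usage.model_construct(completion_tokens_details={"reasoning_tokens": 0})

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    usage.model_dump()  # serializer expects the model, finds a dict -> UserWarning
```

Validating the dict (e.g. `Usage(completion_tokens_details={"reasoning_tokens": 0})`) instead of bypassing validation makes the warning go away, which is presumably the shape of the fix.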
Not just o1-mini, but all OpenAI models. For example, gpt-3.5-turbo:
In [18]: model = "gpt-3.5-turbo"

In [19]: resp = litellm.completion(
    ...:     model=model,
    ...:     num_retries=5,
    ...:     messages=[
    ...:         {"role": "system", "content": "hello"}
    ...:     ],
    ...: )
/Users/reibs/anaconda3/lib/python3.11/site-packages/pydantic/main.py:390: UserWarning: Pydantic serializer warnings:
  Expected `CompletionTokensDetails` but got `dict` with value `{'reasoning_tokens': 0}` - serialized value may not be as expected
  return self.__pydantic_serializer__.to_python(
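Until a fix lands, one way to silence just this warning is a targeted filter — a workaround sketch, not litellm's recommended approach. `message` is a regex matched against the start of the warning text, so other warnings still get through:

```python
import warnings

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    # Ignore only warnings whose text starts with this prefix
    warnings.filterwarnings("ignore", message=r"Pydantic serializer warnings")
    warnings.warn(
        "Pydantic serializer warnings: Expected `CompletionTokensDetails` but got `dict`",
        UserWarning,
    )
    warnings.warn("some other warning", UserWarning)  # not matched, still emitted
```

For a whole application you would call `warnings.filterwarnings(...)` once at startup instead of inside a `catch_warnings` block.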
Groq, for example, doesn't cause this:
In [23]: model = "groq/llama-3.1-70b-versatile"
In [24]: resp = litellm.completion(
    ...:     model=model,
    ...:     num_retries=5,
    ...:     messages=[
    ...:         {"role": "system", "content": "hello"}
    ...:     ],
    ...: )
In [25]:
Anthropic:
In [29]: model = "claude-3-sonnet-20240229"
In [30]: resp = litellm.completion(
    ...:     model=model,
    ...:     num_retries=5,
    ...:     messages=[
    ...:         {"role": "user", "content": "hello"}
    ...:     ],
    ...: )
Hi everyone, looking into this right now - sorry for the delay.
fixed here: https://github.com/BerriAI/litellm/pull/5754
What happened?
I'm using the litellm proxy, and calling o1-mini returns an error. Results in: