BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Bug]: #5811

Closed agamm closed 2 hours ago

agamm commented 2 hours ago

What happened?

A warning is emitted for no apparent reason:

UserWarning: Pydantic serializer warnings

Example code:

import litellm

litellm.drop_params = True

prompt_init = "Output 3 words:"
response_init = (
    litellm.completion(
        model="gpt-4o",
        messages=[{"role": "user", "content": prompt_init}],
        max_tokens=4096,
        temperature=0.0,
        seed=42,
    )
    .choices[0]
    .message.content
)
result = response_init
print(f"Result: {result}")

Relevant log output

/<myproj>/.venv/lib/python3.10/site-packages/pydantic/main.py:387: UserWarning: Pydantic serializer warnings:
  Expected `CompletionTokensDetails` but got `dict` with value `{'reasoning_tokens': 0}` - serialized value may not be as expected
  return self.__pydantic_serializer__.to_python(
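For context, the warning comes from Pydantic v2's serializer, which complains when a field declared as a model instance actually holds a plain dict at serialization time. A minimal sketch reproducing it outside litellm (the model and field names here mirror the log output but are illustrative, not litellm's actual classes):

```python
import warnings

from pydantic import BaseModel


# Illustrative models mirroring the names in the log output.
class CompletionTokensDetails(BaseModel):
    reasoning_tokens: int = 0


class Usage(BaseModel):
    completion_tokens_details: CompletionTokensDetails


# model_construct() skips validation, so the dict is stored as-is;
# serializing it then triggers the "Pydantic serializer warnings" UserWarning,
# just like the traceback above.
usage = Usage.model_construct(completion_tokens_details={"reasoning_tokens": 0})

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    dumped = usage.model_dump()

print(dumped)  # {'completion_tokens_details': {'reasoning_tokens': 0}}
print(caught[0].message if caught else "no warning emitted")
```

The fix on litellm's side was presumably to construct the proper model instance (or validate the dict) before serialization, which is why upgrading resolves it.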


ishaan-jaff commented 2 hours ago

@agamm - are you on the latest version of litellm? This is fixed.

agamm commented 2 hours ago

My bad! I was indeed on an old version; poetry didn't update it, so I had to update it manually. Thanks!
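For anyone hitting the same thing: `poetry update` only moves a dependency within the version constraint already in `pyproject.toml`, so getting the fixed litellm release may require raising the constraint itself. A sketch of the usual commands (which one applies depends on your existing constraint):

```shell
# Update within the existing version constraint in pyproject.toml:
poetry update litellm

# Or raise the constraint to the latest release:
poetry add litellm@latest
```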