BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Bug]: curl chat completions generate TypeError: cannot pickle 'async_generator' object #2698

Open benahmedadel opened 7 months ago

benahmedadel commented 7 months ago

What happened?

When I try to curl chat/completions, I get an error:

    curl --location 'http://0.0.0.0:8080/v1/chat/completions' \
      --header 'Authorization: Bearer sk-1234' \
      --header 'Content-Type: application/json' \
      --data '{ "model": "mistral", "messages": [ { "role": "user", "content": "what llm are you" } ] }'

This gives (for both litellm 1.34.4 and 1.33.7):

{"error":{"message":"cannot pickle 'async_generator' object","type":"None","param":"None","code":500}}
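For reference, the same request can be sent from Python via the OpenAI SDK pointed at the proxy (a minimal sketch, assuming the proxy address and key from the curl command above):

    from openai import OpenAI

    # Point the OpenAI client at the litellm proxy (address/key from the curl above).
    client = OpenAI(base_url="http://0.0.0.0:8080/v1", api_key="sk-1234")

    resp = client.chat.completions.create(
        model="mistral",
        messages=[{"role": "user", "content": "what llm are you"}],
    )
    print(resp.choices[0].message.content)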

Relevant log output

Traceback (most recent call last):
  File "/litellm/venv/lib/python3.10/site-packages/litellm/proxy/proxy_server.py", line 3219, in chat_completion
    response = await proxy_logging_obj.post_call_success_hook(
  File "/litellm/venv/lib/python3.10/site-packages/litellm/proxy/utils.py", line 449, in post_call_success_hook
    new_response = copy.deepcopy(response)
  File "/usr/lib/python3.10/copy.py", line 161, in deepcopy
    rv = reductor(4)
TypeError: cannot pickle 'async_generator' object
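The failure itself is plain Python behavior: copy.deepcopy falls back to pickle (the reductor(4) call in the traceback is __reduce_ex__(4)), and async generators cannot be pickled. A minimal standalone repro, independent of litellm:

    import copy

    async def gen():
        yield "chunk"

    agen = gen()  # an 'async_generator' object, as wrapped by streaming responses

    try:
        copy.deepcopy(agen)  # deepcopy falls back to pickle via __reduce_ex__
    except TypeError as e:
        print(e)  # -> cannot pickle 'async_generator' object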


krrishdholakia commented 7 months ago

That's weird. Why is an async generator being returned for a non-streaming request?

Which provider is this? @benahmedadel
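One way the proxy could defend against this (a hypothetical sketch, not litellm's actual fix): have post_call_success_hook fall back to the original object when the response cannot be deep-copied:

    import copy

    def safe_deepcopy(response):
        # Hypothetical guard: a streaming response wraps an async generator,
        # which cannot be pickled, so copy.deepcopy raises TypeError.
        # Return the original object in that case instead of crashing.
        try:
            return copy.deepcopy(response)
        except TypeError:
            return response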

benahmedadel commented 7 months ago

@krrishdholakia It is ollama/mistral.

edwinjosegeorge commented 4 months ago

@benahmedadel

I am unable to reproduce this error. Could you check if it still occurs? Here is how I tried:

  1. Start Ollama locally: ollama run mistral

  2. Start the proxy server: litellm --model ollama/mistral

  3. Send the request below (from Windows PowerShell):

    $headers = @{
        "Authorization" = "Bearer sk-123"
    }
    Invoke-WebRequest -Uri "http://localhost:4000/v1/chat/completions" `
        -Method Post `
        -Headers $headers `
        -ContentType "application/json" `
        -Body "{ `"model`": `"mistral`", `"messages`": [ { `"role`": `"user`", `"content`": `"what llm are you`" } ] }"
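For anyone not on Windows, a cross-platform equivalent using Python's requests library (a sketch; it assumes the proxy started in step 2 is listening on its default port 4000):

    import requests

    # Same raw HTTP request as the PowerShell Invoke-WebRequest above.
    resp = requests.post(
        "http://localhost:4000/v1/chat/completions",
        headers={"Authorization": "Bearer sk-123"},
        json={
            "model": "mistral",
            "messages": [{"role": "user", "content": "what llm are you"}],
        },
    )
    print(resp.status_code)
    print(resp.json())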