benahmedadel opened 7 months ago
That's weird. Why is an async generator being returned for a non-streaming object?
Which provider is this? @benahmedadel
@krrishdholakia It is OLLAMA/mistral.
@benahmedadel
I am unable to reproduce this error. Could you check if it still occurs? Here is how I tried:

1. Start ollama locally:

```shell
ollama run mistral
```

2. Start the proxy server:

```shell
litellm --model ollama/mistral
```

3. Send the request (this request is sent from Windows PowerShell):

```powershell
$headers = @{
    "Authorization" = "Bearer sk-123"
}
Invoke-WebRequest -Uri "http://localhost:4000/v1/chat/completions" `
    -Method Post `
    -Headers $headers `
    -ContentType "application/json" `
    -Body "{ `"model`": `"mistral`", `"messages`": [ { `"role`": `"user`", `"content`": `"what llm are you`" } ] }"
```
What happened?
When I make a curl request to chat/completions, I get an error:

```shell
curl --location 'http://0.0.0.0:8080/v1/chat/completions' \
  --header 'Authorization: Bearer sk-1234' \
  --header 'Content-Type: application/json' \
  --data '{ "model": "mistral", "messages": [ { "role": "user", "content": "what llm are you" } ] }'
```

This gives (for both litellm 1.34.4 and 1.33.7):

```json
{"error":{"message":"cannot pickle 'async_generator' object","type":"None","param":"None","code":500}}
```
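For context, the 500 itself is just Python refusing to serialize a generator: something in the proxy path apparently received an async generator where a plain response object was expected, and then tried to pickle it (for caching or logging is my guess, not something confirmed here). A minimal sketch of the underlying Python behavior, with a made-up `fake_completion` stand-in:

```python
import pickle


async def fake_completion():
    # Hypothetical stand-in for a streaming chat-completion response;
    # calling it (without iterating) produces an 'async_generator' object.
    yield {"choices": [{"delta": {"content": "hi"}}]}


gen = fake_completion()

try:
    # Generators (sync or async) carry frame state and are not picklable,
    # so this raises TypeError rather than returning bytes.
    pickle.dumps(gen)
except TypeError as exc:
    print(exc)  # cannot pickle 'async_generator' object
```

So whenever a non-streaming request ends up with an async generator as its "response", any pickling step downstream will fail with exactly this message.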