BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Bug]: stream async client bug on proxy #1012

Closed: krrishdholakia closed this issue 1 year ago

krrishdholakia commented 1 year ago

What happened?

When a non-OpenAI / non-Azure model is passed in with streaming enabled on the proxy, an async client initialization error occurs.

Relevant log output

llm-proxy:dev: An error occurred: 'stream_async_client'

Twitter / LinkedIn details

No response
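For context, a minimal reproduction sketch (not taken from the issue): call a locally running LiteLLM proxy through the OpenAI SDK with stream=True against a non-OpenAI model. The port, model name, and API key below are assumptions for illustration only.

```python
# Hypothetical reproduction sketch (assumed setup): a LiteLLM proxy running at
# http://localhost:8000 with a non-OpenAI / non-Azure model (e.g. "claude-2")
# configured, called through the OpenAI SDK with streaming enabled.
import asyncio
from openai import AsyncOpenAI

client = AsyncOpenAI(
    api_key="sk-anything",              # the proxy holds the real provider credentials
    base_url="http://localhost:8000",   # assumed local proxy address
)

async def main():
    # With the bug present, the proxy logged:
    # "An error occurred: 'stream_async_client'"
    stream = await client.chat.completions.create(
        model="claude-2",               # any non-OpenAI / non-Azure model
        messages=[{"role": "user", "content": "Hello"}],
        stream=True,
    )
    async for chunk in stream:
        print(chunk.choices[0].delta.content or "", end="")

asyncio.run(main())
```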

krrishdholakia commented 1 year ago

@ishaan-jaff I know you're working on this right now - just update the ticket once it's resolved.

ishaan-jaff commented 1 year ago

It's fixed - waiting on a new version of litellm to be deployed.

ishaan-jaff commented 1 year ago

Fixed, and added testing on the router for this.
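The actual test added to the repo isn't shown here; as a rough sketch, an async streaming check for a non-OpenAI model through litellm.Router could look like the following (the model name and environment variable are placeholders):

```python
# Rough sketch only (not the actual test added to litellm): stream a completion
# for a non-OpenAI model through litellm.Router and check that chunks arrive.
import asyncio
import os
from litellm import Router

router = Router(
    model_list=[
        {
            "model_name": "claude-2",                # alias callers use
            "litellm_params": {
                "model": "claude-2",                 # non-OpenAI / non-Azure provider model
                "api_key": os.environ.get("ANTHROPIC_API_KEY"),
            },
        }
    ]
)

async def check_router_streaming():
    response = await router.acompletion(
        model="claude-2",
        messages=[{"role": "user", "content": "Hello"}],
        stream=True,
    )
    # Before the fix, iterating the stream raised the 'stream_async_client' error.
    async for chunk in response:
        assert chunk is not None

asyncio.run(check_router_streaming())
```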