Closed: krrishdholakia closed this 1 year ago
What happened?

When a non-OpenAI / Azure model is passed in, an async client initialization error happens.
Relevant log output

llm-proxy:dev: An error occurred: 'stream_async_client'
Twitter / LinkedIn details

No response
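For context, here is a minimal sketch of the kind of call that can hit this path. It is not taken from the original report: the deployment alias, model string, and API key are placeholders, and the stream=True flag is an assumption based on the 'stream_async_client' attribute named in the log.

```python
# Hypothetical repro sketch: a Router configured with a non-OpenAI/Azure
# deployment (Anthropic here), then called asynchronously with streaming.
# On affected versions, the async client lookup ('stream_async_client')
# failed for providers that never had an async client initialized.
import asyncio
from litellm import Router

router = Router(
    model_list=[
        {
            "model_name": "claude",  # alias callers use
            "litellm_params": {
                "model": "claude-2",      # non-OpenAI/Azure provider
                "api_key": "sk-ant-...",  # placeholder
            },
        }
    ]
)

async def main():
    # Async streaming call routed to the non-OpenAI/Azure deployment.
    resp = await router.acompletion(
        model="claude",
        messages=[{"role": "user", "content": "hi"}],
        stream=True,
    )
    async for chunk in resp:
        print(chunk)

asyncio.run(main())
```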
@ishaan-jaff I know you're working on this right now - just update the ticket once this is resolved.
It's fixed - waiting to deploy a new version of litellm.
Fixed + added testing on the router for this.