Closed: krrishdholakia closed this issue 12 months ago
Assistants have a lot going on.
At a high level, a typical integration of the Assistants API involves a multi-step flow across assistants, threads, messages, and runs.
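For context, a typical Assistants integration (per the OpenAI docs) creates an assistant, creates a thread, adds user messages, starts a run, polls until it completes, and then reads the assistant's reply. Below is a minimal in-memory sketch of that lifecycle using stub classes, not the real `openai` SDK; all class and field names here are illustrative:

```python
# Minimal in-memory sketch of the Assistants API lifecycle:
# create assistant -> create thread -> add message -> run -> poll -> read reply.
# Stub classes only; a real integration would use the openai SDK.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Assistant:
    model: str
    instructions: str


@dataclass
class Message:
    role: str
    content: str


@dataclass
class Thread:
    messages: List[Message] = field(default_factory=list)

    def add(self, role: str, content: str) -> None:
        self.messages.append(Message(role, content))


@dataclass
class Run:
    status: str = "queued"

    def poll(self, assistant: Assistant, thread: Thread) -> None:
        # A real run executes server-side and is polled until terminal;
        # here we just append a canned reply and mark the run complete.
        thread.add("assistant", f"[{assistant.model}] reply")
        self.status = "completed"


# Typical flow:
assistant = Assistant(model="gpt-4-turbo", instructions="You are helpful.")
thread = Thread()
thread.add("user", "Hello!")
run = Run()
run.poll(assistant, thread)
print(run.status)  # completed
```

The point of the sketch is the amount of stateful machinery involved (threads, message lists, run status polling), which is what makes this API heavier to abstract than the stateless completions endpoint.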
Hi Krish, it would be great if LiteLLM supported an abstraction layer over these OpenAI API endpoints, as you did for the completions endpoints! That said, the current "Assistant" architecture (built upon Threads, Messages, Runs, etc.) is, in my modest opinion, still a very draft, opinionated framework. I'm not sure it is the best architecture for the functional requirement, nor that other vendors or open-source projects will follow the same approach.
So my modest suggestion is to deprioritize support for the OpenAI Assistants APIs. I'd instead prioritize support for the latest OpenAI version (see https://github.com/BerriAI/litellm/issues/799), because that issue impacts not only the native OpenAI provider but also Azure OpenAI deployments.
Just an idea. Thanks, Giorgio
agreed @solyarisoftware
we have a dev release out v1.0.0.dev1
and are tracking this issue here - https://github.com/BerriAI/litellm/issues/774
+1 on OpenAI Assistants API support
@slavakurilyak why use litellm to call the assistants api? would help to understand the use-case
Our drop-in replacement for the Assistants API uses LiteLLM to support multiple LLM and embedding models: https://github.com/datastax/astra-assistants-api. Not sure if that's what you're looking for, @slavakurilyak
The Feature
https://platform.openai.com/docs/api-reference/assistants
Motivation, pitch
openai-compatibility