Closed: jeromeroussin closed this issue 4 months ago
Acknowledging this issue @jeromeroussin. Hoping to have a v0 out for testing by end of week.
This would certainly be awesome; we're specifically looking for support for:
/openai/assistants
/openai/threads
/openai/threads/${thread_id}/messages
/openai/threads/${thread_id}/runs
/openai/threads/${thread_id}/runs/${run_id}
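For concreteness, the requested routes above could be composed like this. This is a sketch only: the localhost base URL and the `/openai` route prefix are assumptions for illustration, not confirmed LiteLLM proxy routing.

```python
# Sketch of the endpoint paths requested above, composed against a
# hypothetical LiteLLM proxy base URL. BASE_URL and the /openai prefix
# are assumptions for illustration, not confirmed behavior.
from typing import Optional

BASE_URL = "http://localhost:4000"  # hypothetical proxy address


def assistants_url() -> str:
    return f"{BASE_URL}/openai/assistants"


def threads_url() -> str:
    return f"{BASE_URL}/openai/threads"


def messages_url(thread_id: str) -> str:
    return f"{threads_url()}/{thread_id}/messages"


def runs_url(thread_id: str, run_id: Optional[str] = None) -> str:
    # Without a run_id this targets the runs collection; with one, a single run.
    base = f"{threads_url()}/{thread_id}/runs"
    return f"{base}/{run_id}" if run_id else base


print(runs_url("thread_abc", "run_123"))
# http://localhost:4000/openai/threads/thread_abc/runs/run_123
```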
@krrishdholakia Any updates on this?
The OpenAI Assistants API abstracts away the complexity of handling the stateful operations required by LLM-based apps, including persistent threads, messages, and files.
This API unlocks automatic RAG pipelines, so developers don't need to build and manage their own vector-store infrastructure.
Plus, unifying data and LLMs within a single API is an underrated idea that saves developers time and, ultimately, money.
hey @slavakurilyak @jeromeroussin @taralika we're hoping to have a v0 out for feedback by end of week.
Hi @taralika @jeromeroussin @slavakurilyak, the PR is now live. Aiming for SDK support to be live today.
Is this something someone can give me feedback on next week? Help would be appreciated.
Next steps: adding Azure + proxy support
Excited to see proxy support https://github.com/VRSEN/agency-swarm/issues/112
Hi @krrishdholakia, sorry for replying in a closed issue.
I noticed that the current implementation of the Assistants API only allows one provider: `openai`.
Does this mean that we can only use OpenAI's models, and cannot use other models such as Claude 3, Llama 3, or even Azure OpenAI's models for now?
If that's the case, is there any plan to support Assistants API for various models - like what LiteLLM did for chat/completion APIs - in the future?
Thanks!
I'll be adding a provider that supports other models once https://github.com/datastax/astra-assistants-api/issues/22 is complete
@RussellLuo how would you suggest we support LiteLLM's completion calls within the Assistants API framework? This seems pretty provider-specific.
I think the next step would be adding the Azure endpoints.
@krrishdholakia Indeed, supporting the Assistants API requires a complete backend implementation. Looking forward to the support for Azure endpoints!
@phact Looks awesome, thanks for the great work!
The Feature
The Assistants API comes with new endpoints that LiteLLM does not currently support. Docs:
We'd like to be able to proxy calls to the Assistants API with LiteLLM.
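As a rough sketch of what proxying might look like on the configuration side, something like the fragment below could route Assistants calls to a single upstream provider. The `assistant_settings` key and its fields are assumptions for illustration; check the LiteLLM proxy docs for the actual schema.

```yaml
# Hypothetical proxy config sketch -- key names are assumptions,
# not confirmed LiteLLM options.
assistant_settings:
  custom_llm_provider: openai
  litellm_params:
    api_key: os.environ/OPENAI_API_KEY
```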
Motivation, pitch
The Assistants API is a quick way to add "chat-with-your-doc" and Code Interpreter capabilities to an existing chatbot.
Twitter / LinkedIn details
https://www.linkedin.com/in/jeromeroussin/