Closed phact closed 3 months ago
This is very useful for my project. Please implement. Thanks!
Astra Assistants looks OpenAI-compatible, so this should already be supported: just add the `openai/` prefix to the model name.
config:

```yaml
assistant_settings:
  custom_llm_provider: openai
  litellm_params:
    api_key: os.environ/ASTRA_API_KEY
    api_base: os.environ/ASTRA_API_BASE
```
curl:

```shell
curl -X POST "http://localhost:4000/v1/assistants" \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-1234" \
  -d '{
    "instructions": "You are a personal math tutor. When asked a question, write and run Python code to answer the question.",
    "name": "Math Tutor",
    "tools": [{"type": "code_interpreter"}],
    "model": "openai/<my-astra-model-name>"
  }'
```
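For anyone who prefers Python, here is a minimal sketch of the same request built with only the standard library. The proxy URL and key are taken from the curl example above, and `<my-astra-model-name>` remains a placeholder; this assumes a litellm proxy is running locally.

```python
import json
import urllib.request

BASE_URL = "http://localhost:4000/v1"  # proxy address from the curl example
API_KEY = "sk-1234"                    # proxy key from the curl example

# Same JSON body as the curl example above.
payload = {
    "instructions": (
        "You are a personal math tutor. When asked a question, "
        "write and run Python code to answer the question."
    ),
    "name": "Math Tutor",
    "tools": [{"type": "code_interpreter"}],
    # The openai/ prefix routes the request through litellm's
    # openai-compatible provider path.
    "model": "openai/<my-astra-model-name>",
}

req = urllib.request.Request(
    f"{BASE_URL}/assistants",
    data=json.dumps(payload).encode(),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {API_KEY}",
    },
    method="POST",
)
# urllib.request.urlopen(req) would send it once the proxy is up.
```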
The Feature
Currently litellm supports OpenAI as the only Assistants API provider. This PR: https://github.com/BerriAI/litellm/pull/4118 adds support for astra-assistants, which lets users leverage the Assistants API with multiple LLM and embedding providers.
Motivation, pitch
Folks have been asking for litellm Assistants support for other (non-OpenAI) models: https://github.com/BerriAI/litellm/issues/2842#issuecomment-2114941184
Twitter / LinkedIn details
https://www.linkedin.com/in/sestevez/ syllogistic