BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Feature]: Astra Assistants Support #4641

Closed: phact closed this issue 3 months ago

phact commented 3 months ago

The Feature

Currently litellm supports OpenAI as the only Assistants API provider. This PR: https://github.com/BerriAI/litellm/pull/4118 adds support for astra-assistants, which lets users leverage the Assistants API with multiple LLM and embedding providers.

Motivation, pitch

Folks have been asking for litellm Assistants support for other (non-OpenAI) models: https://github.com/BerriAI/litellm/issues/2842#issuecomment-2114941184

Twitter / LinkedIn details

https://www.linkedin.com/in/sestevez/ syllogistic

cheddarking commented 3 months ago

This is very useful for my project. Please implement. Thanks!

krrishdholakia commented 3 months ago

Astra Assistants looks OpenAI-compatible.

This should already be supported by adding the `openai/` prefix to the model name:

config

assistant_settings:
  custom_llm_provider: openai
  litellm_params: 
    api_key: os.environ/ASTRA_API_KEY
    api_base: os.environ/ASTRA_API_BASE

curl

curl -X POST "http://localhost:4000/v1/assistants" \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-1234" \
  -d '{
    "instructions": "You are a personal math tutor. When asked a question, write and run Python code to answer the question.",
    "name": "Math Tutor",
    "tools": [{"type": "code_interpreter"}],
    "model": "openai/<my-astra-model-name>"
  }'
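For reference, the same request can be built from Python using only the standard library. This is a minimal sketch, not a tested integration: the proxy URL, the `sk-1234` key, and `<my-astra-model-name>` are placeholders taken from the curl example above, and the config shown earlier is assumed to already route `openai/` models to Astra via `ASTRA_API_BASE`.

```python
import json
import urllib.request

# Placeholder values mirroring the curl example above (not real credentials).
PROXY_BASE = "http://localhost:4000"
MASTER_KEY = "sk-1234"

# Same JSON body as the curl call.
payload = {
    "instructions": (
        "You are a personal math tutor. When asked a question, "
        "write and run Python code to answer the question."
    ),
    "name": "Math Tutor",
    "tools": [{"type": "code_interpreter"}],
    # The openai/ prefix tells litellm to treat the backend as
    # OpenAI-compatible; api_base in the config points it at Astra.
    "model": "openai/<my-astra-model-name>",
}

# Build the POST request (constructing a Request does not send it).
req = urllib.request.Request(
    url=f"{PROXY_BASE}/v1/assistants",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {MASTER_KEY}",
    },
    method="POST",
)
# With the proxy running, urllib.request.urlopen(req) would send it.
print(req.get_method(), req.full_url)
```

Any OpenAI client SDK pointed at the proxy's base URL should work the same way, since the proxy exposes the standard `/v1/assistants` route.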