BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Feature]: Move to azure cohere to use cohere format #2975

Open krrishdholakia opened 6 months ago

krrishdholakia commented 6 months ago

The Feature

Azure's OpenAI-format spec doesn't have params like `documents`, which are supported by the Cohere spec.

For example, for Command R: https://learn.microsoft.com/en-us/azure/ai-studio/how-to/deploy-models-cohere-command#v1chatcompletions
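To illustrate the mismatch, here is a minimal sketch of the two payload shapes: the Cohere chat format accepts a top-level `documents` list (e.g. title/snippet pairs), while an OpenAI-format chat payload has no slot for it. These are illustrative payloads under that assumption, not actual requests:

```python
# Sketch of the schema mismatch (illustrative payloads, not real requests).

# OpenAI-format chat payload (what an OpenAI-compatible route accepts):
openai_payload = {
    "messages": [{"role": "user", "content": "What does the report say about Q3?"}],
    "temperature": 0.3,
}

# Cohere-format chat payload: same question, but with RAG context passed
# via the top-level `documents` param the OpenAI schema has no field for.
cohere_payload = {
    "message": "What does the report say about Q3?",
    "documents": [
        {"title": "Q3 report", "snippet": "Revenue grew 12% quarter over quarter."},
    ],
}

# `documents` only exists in the Cohere-format payload.
print("documents" in openai_payload, "documents" in cohere_payload)
```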

Motivation, pitch

A user was trying to pass `documents` to an Azure-hosted Cohere model via litellm and ran into this issue.

Twitter / LinkedIn details

cc: @JungeAlexander

JungeAlexander commented 6 months ago

Thanks for opening this @krrishdholakia. I agree this is quite confusing. The following chat endpoint seems to have the documents parameter (for RAG), as well as tools and tool_result, which are both supported by the Cohere Command R and R Plus models:

https://learn.microsoft.com/en-us/azure/ai-studio/how-to/deploy-models-cohere-command#v1chat
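As a rough sketch of what a pass-through to that `/v1/chat` route could look like, here is a hypothetical helper that maps OpenAI-style messages plus a `documents` param onto a Cohere-format request body. The field names (`message`, `chat_history`, `documents`) follow the Cohere chat schema in the linked docs; the helper itself and the role mapping are assumptions, not litellm's actual implementation:

```python
import json

# Hypothetical helper (not litellm code): build a Cohere-format /v1/chat
# request body from OpenAI-style messages plus an extra `documents` param.
def build_cohere_chat_body(messages, documents=None):
    # The Cohere chat schema takes the latest user turn as `message`
    # and the earlier turns as `chat_history`.
    *history, last = messages
    body = {
        "message": last["content"],
        "chat_history": [
            {"role": m["role"], "message": m["content"]} for m in history
        ],
    }
    if documents:
        body["documents"] = documents
    return body

body = build_cohere_chat_body(
    [{"role": "user", "content": "Summarize the attached doc."}],
    documents=[{"title": "notes", "snippet": "LiteLLM routes 100+ LLM APIs."}],
)
print(json.dumps(body, indent=2))
```

The point of the sketch: `documents` survives as a top-level field in the Cohere-format body, which is exactly what the OpenAI-format route drops.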