BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Feature]: support content list for azure ai #4237

Closed: krrishdholakia closed this issue 1 month ago

krrishdholakia commented 5 months ago

The Feature

Azure AI supports content only as a string. OpenAI supports content as a list.

This raises errors when list-style content is passed straight through to Azure AI.
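For context, a minimal sketch of the two message shapes involved; the prompt text and image URL are placeholders:

```python
# OpenAI-style vision message: content is a list of typed parts.
openai_style = {
    "role": "user",
    "content": [
        {"type": "text", "text": "What is in this image?"},
        {"type": "image_url", "image_url": {"url": "https://example.com/cat.png"}},
    ],
}

# Azure AI style (e.g. Command R+): content must be a plain string.
azure_ai_style = {
    "role": "user",
    "content": "What is in this image?",
}
```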

Motivation, pitch

A streamlined way to call vision and non-vision models would be great. Being LLM-agnostic is a big reason why I use the package, but currently I still have to handle different request formats depending on which model the request goes to.

For example: when calling GPT-4 Vision, messages.content is an array. Using the same code to call Azure's Command R+ results in:

litellm.exceptions.APIError: OpenAIException - Error code: 400 - {'message': 'invalid type: parameter messages.content is of type array but should be of type string.'}

I'm aware this is on the model provider's side, but GPT's non-vision models, for example, accept both formats.
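As a client-side workaround until this is supported, one could flatten list-style content before sending it to Azure AI. This is a minimal sketch, not litellm's implementation; flatten_content is a hypothetical helper that assumes text-only parts and drops anything without a string equivalent (e.g. image parts):

```python
def flatten_content(messages: list[dict]) -> list[dict]:
    """Convert each message's OpenAI list-style content into a single string."""
    flattened = []
    for msg in messages:
        content = msg.get("content")
        if isinstance(content, list):
            # Join the text of every {"type": "text"} part; non-text parts
            # (e.g. image_url) have no string form and are skipped.
            text = "\n".join(
                part.get("text", "")
                for part in content
                if part.get("type") == "text"
            )
            msg = {**msg, "content": text}
        flattened.append(msg)
    return flattened
```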

Twitter / LinkedIn details

cc: @ducnvu

ishaan-jaff commented 1 month ago

Hi @ducnvu, thanks for using LiteLLM. Any chance we can hop on a call to learn how we can improve LiteLLM Proxy for you?

We’re planning our roadmap and I’d love to get your feedback.

My cal for your convenience: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
My LinkedIn if you prefer DMs: https://www.linkedin.com/in/reffajnaahsi/