BerriAI / litellm

Call all LLM APIs using the OpenAI format. Use Bedrock, Azure, OpenAI, Cohere, Anthropic, Ollama, Sagemaker, HuggingFace, Replicate (100+ LLMs)
https://docs.litellm.ai/docs/

[Feature]: support content list for azure ai #4237

Open krrishdholakia opened 1 week ago

krrishdholakia commented 1 week ago

The Feature

Azure AI supports `content` only as a string, while OpenAI also supports `content` as a list of typed parts.

Passing a content list straight through to Azure AI therefore raises an error.
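For reference, the two `content` shapes look roughly like this (a sketch assuming the standard OpenAI chat-completions message format; the image URL is illustrative):

```python
# OpenAI-style message: content may be a list of typed parts.
# This form is required for vision requests (text + image in one message).
list_content = {
    "role": "user",
    "content": [
        {"type": "text", "text": "What is in this image?"},
        {"type": "image_url", "image_url": {"url": "https://example.com/photo.png"}},
    ],
}

# Azure AI-style message: content must be a plain string.
string_content = {
    "role": "user",
    "content": "What is in this image?",
}
```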

Motivation, pitch

A streamlined way to call vision and non-vision models would be great. Being LLM-agnostic is a big reason I use this package, but currently I still have to handle different request formats depending on which model the request goes to.

For example, when calling GPT-4 Vision, `messages.content` is an array. Using the same code to call Command R+ on Azure results in:

`litellm.exceptions.APIError: OpenAIException - Error code: 400 - {'message': 'invalid type: parameter messages.content is of type array but should be of type string.'}`

I'm aware this is on the model provider's side, but GPT's non-vision models, for example, support both formats.
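The requested translation could be sketched as follows. The `flatten_content` helper below is hypothetical, not part of litellm's API; it shows one possible policy (keep text parts, drop non-text parts) for collapsing an OpenAI-style content list into the plain string Azure AI expects:

```python
def flatten_content(messages):
    """Convert OpenAI-style content lists into plain strings.

    Hypothetical sketch of the requested feature. Only text parts
    have a string equivalent, so image parts are simply dropped here;
    a real implementation might raise for them instead.
    """
    flattened = []
    for message in messages:
        content = message.get("content")
        if isinstance(content, list):
            # Keep only the text parts and join them into one string.
            text_parts = [
                part["text"] for part in content
                if part.get("type") == "text"
            ]
            # Copy the message rather than mutating the caller's dict.
            message = {**message, "content": "\n".join(text_parts)}
        flattened.append(message)
    return flattened


messages = [{
    "role": "user",
    "content": [
        {"type": "text", "text": "Describe this image."},
        {"type": "image_url", "image_url": {"url": "https://example.com/cat.png"}},
    ],
}]
flattened = flatten_content(messages)
# flattened[0]["content"] is now the string "Describe this image."
```

String-valued `content` passes through unchanged, so the same call path would work for both vision and non-vision providers.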

Twitter / LinkedIn details

cc: @ducnvu