BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Bug]: `OpenAI Embedding` does not support `modality` parameter in `extra_body` #6525

Open S1LV3RJ1NX opened 4 hours ago

S1LV3RJ1NX commented 4 hours ago

What happened?

I am trying to hit an embedding API via AsyncOpenAI. The following code works:

from openai import AsyncOpenAI

client = AsyncOpenAI(base_url="http://localhost:7997", api_key="sk-infinity-svc")
response = await client.embeddings.create(
    model="michaelfeil/colpali-v12-random-testing",
    input=[base64_image],  # base64-encoded image string
    encoding_format="float",
    extra_body={
        "modality": "image"
    },
)
print(response)

But when I try the same with litellm (assuming the litellm client is initialized with the necessary base URL and key),

response = await self.client.embeddings.create(
    input=input,
    model=embedding_model,
    encoding_format="float",
    extra_body={"modality": "image"},
)

I get the following error: litellm.llms.OpenAI.openai.OpenAIError: AsyncEmbeddings.create() got an unexpected keyword argument modality
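
For reference, the OpenAI SDK merges `extra_body` into the JSON request body, so the working call above is equivalent to a raw POST in which `modality` is just a top-level field; that is the parameter litellm would need to forward to the upstream server. A rough sketch with httpx (the `/embeddings` path directly under the base URL is assumed):

import httpx

base64_image = "<base64-encoded image>"  # placeholder for the real payload

# `modality` ends up as a top-level field in the request body.
payload = {
    "model": "michaelfeil/colpali-v12-random-testing",
    "input": [base64_image],
    "encoding_format": "float",
    "modality": "image",
}
response = httpx.post(
    "http://localhost:7997/embeddings",
    headers={"Authorization": "Bearer sk-infinity-svc"},
    json=payload,
)
print(response.json())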

config file:

- model_name: colpali-random
  litellm_params:
    model: openai/michaelfeil/colpali-v12-random-testing
    api_base: http://infinity:7997
    api_key: "sk-infinity-svc"
  model_info:
    type: embedding

Relevant log output

An error occurred: litellm.APIError: APIError: OpenAIException - AsyncEmbeddings.create() got an unexpected keyword argument 'modality'
Received Model Group=colpali-random
Available Model Group Fallbacks=None, LiteLLM Max Retries: 1 
Model: openai/michaelfeil/colpali-v12-random-testing
API Base: `http://infinity:7997`
model_group: `colpali-random`

deployment: `openai/michaelfeil/colpali-v12-random-testing`


S1LV3RJ1NX commented 3 hours ago

This was solved by updating the YAML:

- model_name: colpali-random
  litellm_params:
    model: openai/michaelfeil/colpali-v12-random-testing
    api_base: http://infinity:7997
    api_key: "sk-*****"
    extra_body: { "modality": "image" }
  model_info:
    type: embedding
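
With `modality` pinned in `litellm_params.extra_body`, the client call no longer has to pass it per request. A rough sketch (the proxy URL and key below are placeholders for the litellm proxy, not for Infinity):

from openai import AsyncOpenAI

# Placeholder proxy URL/key; point these at the litellm proxy deployment.
client = AsyncOpenAI(base_url="http://localhost:4000", api_key="sk-proxy-key")

async def embed_image(base64_image: str):
    # No extra_body here: the proxy injects {"modality": "image"} from the config.
    return await client.embeddings.create(
        model="colpali-random",
        input=[base64_image],
        encoding_format="float",
    )
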
S1LV3RJ1NX commented 2 hours ago

But I cannot update the `modality` param at runtime. :(

For now I am making two entries in the YAML:

- model_name: colpali-random
  litellm_params:
    model: openai/michaelfeil/colpali-v12-random-testing
    api_base: http://infinity:7997
    api_key: "sk-*****"
    extra_body: { "modality": "image" }
  model_info:
    type: embedding

- model_name: colpali-random-text
  litellm_params:
    model: openai/michaelfeil/colpali-v12-random-testing
    api_base: http://infinity:7997
    api_key: "sk-*****"
    extra_body: { "modality": "text" }
  model_info:
    type: embedding
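
With the two aliases, choosing the modality at runtime comes down to choosing the model name on the client side (a sketch; the helper name is illustrative):

async def embed(client, payload: str, is_image: bool):
    # Route to the alias whose config pins the desired modality.
    model_name = "colpali-random" if is_image else "colpali-random-text"
    return await client.embeddings.create(
        model=model_name,
        input=[payload],
        encoding_format="float",
    )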