BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Bug]: `OpenAI Embedding` does not support `modality` parameter in `extra_body` #6525

Open · S1LV3RJ1NX opened this issue 3 weeks ago

S1LV3RJ1NX commented 3 weeks ago

What happened?

I am trying to hit an embedding API via `AsyncOpenAI`. The following code works:

from openai import AsyncOpenAI

client = AsyncOpenAI(base_url="http://localhost:7997", api_key="sk-infinity-svc")
response = await client.embeddings.create(
    model="michaelfeil/colpali-v12-random-testing",
    input=[base64_image],  # base64-encoded image string
    encoding_format="float",
    extra_body={"modality": "image"},
)
print(response)

But when I try the same through litellm (assume the litellm client is initialized with the necessary URL and key),

response = await self.client.embeddings.create(
    input=input,
    model=embedding_model,
    encoding_format="float",
    extra_body={"modality": "image"},
)

I get the following error: `litellm.llms.OpenAI.openai.OpenAIError: AsyncEmbeddings.create() got an unexpected keyword argument 'modality'`
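
For context, the OpenAI Python SDK folds `extra_body` keys into the request JSON, so the upstream server receives `modality` as an ordinary top-level field; the traceback suggests the LiteLLM proxy instead forwards the unknown field as a Python keyword argument to the SDK's `AsyncEmbeddings.create()`, which rejects it. A minimal sketch of the equivalent raw request against the Infinity server (using `httpx`; `base64_image` is a placeholder):

import httpx

base64_image = "<base64-encoded image>"  # placeholder

# Equivalent raw request: the SDK merges extra_body into the JSON body,
# so "modality" arrives as a plain top-level field.
response = httpx.post(
    "http://localhost:7997/embeddings",
    headers={"Authorization": "Bearer sk-infinity-svc"},
    json={
        "model": "michaelfeil/colpali-v12-random-testing",
        "input": [base64_image],
        "encoding_format": "float",
        "modality": "image",
    },
)
print(response.json())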

config file:

  - model_name: colpali-random
    litellm_params:
      model: openai/michaelfeil/colpali-v12-random-testing
      api_base: http://infinity:7997
      api_key: "sk-infinity-svc"
    model_info:
      type: embedding
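
(To reproduce: the proxy is started with this config, e.g. `litellm --config config.yaml`, and the client points at the proxy's URL rather than at Infinity directly.)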

Relevant log output

An error occurred: litellm.APIError: APIError: OpenAIException - AsyncEmbeddings.create() got an unexpected keyword argument 'modality'
Received Model Group=colpali-random
Available Model Group Fallbacks=None, LiteLLM Max Retries: 1 
Model: openai/michaelfeil/colpali-v12-random-testing
API Base: `http://infinity:7997`
model_group: `colpali-random`

deployment: `openai/michaelfeil/colpali-v12-random-testing`


S1LV3RJ1NX commented 3 weeks ago

This was solved by updating the YAML:

  - model_name: colpali-random
    litellm_params:
      model: openai/michaelfeil/colpali-v12-random-testing
      api_base: http://infinity:7997
      api_key: "sk-*****"
      extra_body: { "modality": "image" }
    model_info:
      type: embedding
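
With `extra_body` pinned in the config, the proxy injects `modality` itself and the client call stays plain OpenAI. A sketch, assuming the proxy listens at `http://localhost:4000` with a placeholder key:

from openai import AsyncOpenAI

# Talk to the LiteLLM proxy; "modality" comes from the config's extra_body,
# so it is not passed here.
client = AsyncOpenAI(base_url="http://localhost:4000", api_key="sk-1234")
response = await client.embeddings.create(
    model="colpali-random",
    input=[base64_image],  # base64_image as defined earlier
    encoding_format="float",
)
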
S1LV3RJ1NX commented 3 weeks ago

But I cannot update the `modality` param at runtime. :(

For now I am making two entries in the YAML:

  - model_name: colpali-random
    litellm_params:
      model: openai/michaelfeil/colpali-v12-random-testing
      api_base: http://infinity:7997
      api_key: "sk-*****"
      extra_body: { "modality": "image" }
    model_info:
      type: embedding

  - model_name: colpali-random-text
    litellm_params:
      model: openai/michaelfeil/colpali-v12-random-testing
      api_base: http://infinity:7997
      api_key: "sk-*****"
      extra_body: { "modality": "text" }
    model_info:
      type: embedding
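
With that workaround, a runtime switch reduces to picking the model group (a sketch using the two names from the config above):

# Map the desired modality to the model group whose config pins it.
MODEL_BY_MODALITY = {
    "image": "colpali-random",
    "text": "colpali-random-text",
}

async def embed(client, inputs, modality):
    return await client.embeddings.create(
        model=MODEL_BY_MODALITY[modality],
        input=inputs,
        encoding_format="float",
    )
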
michaelfeil commented 3 weeks ago

@S1LV3RJ1NX I implemented `modality` similarly to how, e.g., guided decoding is implemented with extra kwargs in vLLM.

Now the question is whether LiteLLM has implemented that for vLLM. If not, the issue would be pressing for the repo maintainer; if yes, implement it the same way? https://docs.vllm.ai/en/v0.5.1/serving/openai_compatible_server.html
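
For reference, a sketch of how a vLLM extra parameter travels through the same `extra_body` mechanism (here `guided_choice`, one of vLLM's documented extra params; the server URL and model are placeholders):

from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")
completion = client.chat.completions.create(
    model="mistralai/Mistral-7B-Instruct-v0.2",
    messages=[{"role": "user", "content": "Is this review positive or negative?"}],
    # vLLM reads guided_choice from the request body,
    # just as Infinity reads modality.
    extra_body={"guided_choice": ["positive", "negative"]},
)
print(completion.choices[0].message.content)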

S1LV3RJ1NX commented 3 weeks ago

Yes, `extra_body` seems to be an issue for now, and the only way is to add multiple models in the YAML. Not sure if the repo maintainer has any other ideas.