Open S1LV3RJ1NX opened 3 weeks ago
This was solved by updating the YAML:

```yaml
- model_name: colpali-random
  litellm_params:
    model: openai/michaelfeil/colpali-v12-random-testing
    api_base: http://infinity:7997
    api_key: "sk-*****"
    extra_body: { "modality": "image" }
  model_info:
    type: embedding
```
But I cannot update the modality param at runtime. :(
For now I am making two entries in the YAML:

```yaml
- model_name: colpali-random
  litellm_params:
    model: openai/michaelfeil/colpali-v12-random-testing
    api_base: http://infinity:7997
    api_key: "sk-*****"
    extra_body: { "modality": "image" }
  model_info:
    type: embedding
- model_name: colpali-random-text
  litellm_params:
    model: openai/michaelfeil/colpali-v12-random-testing
    api_base: http://infinity:7997
    api_key: "sk-*****"
    extra_body: { "modality": "text" }
  model_info:
    type: embedding
```
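With the two-entry workaround, the caller has to pick the right `model_name` per request. A small routing helper (a sketch only; the mapping mirrors the two entries above) could look like:

```python
# Sketch: map the desired modality to the model_name entries configured
# in the YAML above (colpali-random -> image, colpali-random-text -> text).
MODALITY_TO_MODEL = {
    "image": "colpali-random",
    "text": "colpali-random-text",
}

def model_for(modality: str) -> str:
    """Return the proxy model_name configured for the given modality."""
    try:
        return MODALITY_TO_MODEL[modality]
    except KeyError:
        raise ValueError(f"no model configured for modality {modality!r}")

print(model_for("text"))  # colpali-random-text
```

The client then passes the returned name as `model` when calling the proxy, instead of trying to override `extra_body` per request.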
@S1LV3RJ1NX I implemented modality similarly to how e.g. guidance is implemented with extra kwargs in vLLM. Now the question is whether LiteLLM has implemented that for vLLM. If not, the issue would be pressing for the repo maintainer; if yes, implement it the same way? https://docs.vllm.ai/en/v0.5.1/serving/openai_compatible_server.html
Yes, extra body seems to be an issue for now, and the only way is to add multiple models in the YAML. Not sure if the repo maintainer has any other ideas.
What happened?
I am trying to hit an embedding API via AsyncOpenAI. The following code works:
But when I try the same with litellm (assume litellm is initialized with the necessary URL and key), I get the following error:

```
litellm.llms.OpenAI.openai.OpenAIError: AsyncEmbeddings.create() got an unexpected keyword argument 'modality'
```
Config file: