
[Feature]: Support OpenRouter's "provider" argument to control/select providers #6857

paul-gauthier opened this issue 14 hours ago

The Feature

OpenRouter supports several mechanisms for selecting which providers a request is routed to, all of which involve passing a provider argument in the request body. litellm currently rejects that argument with an error:

import litellm

# Passing any provider preferences payload triggers the error,
# even an empty dict.
model = "openrouter/qwen/qwen-2.5-coder-32b-instruct"
messages = [{"role": "user", "content": "say 'hi!'"}]

comp = litellm.completion(
    model=model,
    messages=messages,
    provider=dict(),
)
print(comp)

which raises:

litellm.exceptions.APIError: litellm.APIError: APIError: OpenrouterException - Completions.create() got an unexpected keyword argument 'provider'
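
Until native support lands, one possible workaround may be to forward the payload through extra_body, which litellm passes along to OpenAI-compatible endpoints. This is a sketch under that assumption; the order and allow_fallbacks fields follow OpenRouter's provider-routing docs and are illustrative, not taken from this issue:

import litellm

# Hedged sketch: send OpenRouter's provider preferences via extra_body,
# assuming litellm forwards extra_body verbatim in the upstream request.
comp = litellm.completion(
    model="openrouter/qwen/qwen-2.5-coder-32b-instruct",
    messages=[{"role": "user", "content": "say 'hi!'"}],
    extra_body={
        "provider": {
            "order": ["Together", "DeepInfra"],  # try these providers in order
            "allow_fallbacks": False,            # error out rather than fall back
        }
    },
)
print(comp)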

Motivation, pitch

OpenRouter load-balances each model across multiple providers that differ in price, context window, quantization, and so on, so it is often important to control which providers a request can reach.
