lm-sys / RouteLLM

A framework for serving and evaluating LLM routers - save LLM costs without compromising quality!
Apache License 2.0

Add LiteLLM support #11

Closed iojw closed 1 month ago

iojw commented 1 month ago

Integrates LiteLLM, making it easier to support a wider range of model providers, including Anthropic, Amazon Bedrock, and Google AI Studio.

We continue to support OpenAI-compatible endpoints via the --alt-base-url and --alt-api-key flags. To use one, prefix the model name with openai/ and it will be treated as an OpenAI-compatible endpoint.
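The prefix-based dispatch described above can be sketched as follows. This is an illustrative helper, not RouteLLM's actual code: `pick_backend` is a hypothetical name, and the tuple it returns is just one way to represent the decision.

```python
from typing import Optional, Tuple


def pick_backend(model: str, alt_base_url: Optional[str] = None) -> Tuple[str, Optional[str], str]:
    """Decide how a request for `model` would be dispatched (illustrative only).

    Model names prefixed with "openai/" are treated as OpenAI-compatible
    endpoints (the target configured by --alt-base-url / --alt-api-key);
    everything else is handed to LiteLLM, which resolves the provider
    (e.g. Anthropic, Amazon Bedrock, Google AI Studio) from the name.
    """
    prefix = "openai/"
    if model.startswith(prefix):
        # Strip the prefix so the downstream endpoint sees the bare model name.
        return ("openai-compatible", alt_base_url, model[len(prefix):])
    return ("litellm", None, model)
```

For example, `pick_backend("openai/my-local-model", "http://localhost:8000/v1")` routes to the OpenAI-compatible endpoint with the bare name `my-local-model`, while an unprefixed name like `anthropic/claude-3-opus` would be passed through to LiteLLM unchanged.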