BerriAI / litellm


[Feature]: Add design policy document for huge utils.py maintenance #3345

Open · nobu007 opened this issue 5 months ago

nobu007 commented 5 months ago

The Feature (request)

How about adding a design policy document? It would make maintenance easier.

Content idea

The role of variables

model

The model name. For example: https://litellm.vercel.app/docs/providers/openrouter

custom_llm_provider

The provider name, e.g. openai, anthropic, openrouter.

By default, it is the first segment of model:

anthropic/claude-3-xx => anthropic
openrouter/anthropic/claude-3-xx => openrouter
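
For illustration, a minimal sketch of that default (the helper name is made up here, this is not litellm's actual code):

```python
# Hypothetical sketch: derive the provider from the first
# "/"-separated segment of the model string.
def infer_provider(model: str) -> str | None:
    if "/" in model:
        return model.split("/", 1)[0]
    return None

assert infer_provider("anthropic/claude-3-xx") == "anthropic"
assert infer_provider("openrouter/anthropic/claude-3-xx") == "openrouter"
```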

api_base

The base URL of the API endpoint to call.

dynamic_api_key

The API key resolved at runtime for the selected provider.

Overall decision flow

The priority order is as follows; a usage sketch follows the list.

  1. model
  2. custom_llm_provider
  3. api_base
  4. dynamic_api_key
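
A hedged sketch of how this order plays out through get_llm_provider (signature and 4-tuple return order are assumed from the current litellm/utils.py and may change):

```python
import litellm

# 1. A provider encoded in the model string takes effect first.
_, provider, _, _ = litellm.get_llm_provider(
    model="openrouter/anthropic/claude-3-xx"  # "claude-3-xx" is a placeholder
)
print(provider)  # expected: "openrouter"

# 2. With no prefix on model, an explicit custom_llm_provider is used.
_, provider, _, _ = litellm.get_llm_provider(
    model="claude-3-xx", custom_llm_provider="anthropic"
)
print(provider)  # expected: "anthropic"
```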

Decision flow for each variable

(This may be more detail than a design document needs.)

model

  1. Taken from the input parameter.

custom_llm_provider

  1. Taken from the input parameter.
  2. If not set, xxx

api_base

  1. Taken from the input parameter.
  2. If not set, xxx

dynamic_api_key

  1. The llm_provider decides this.

Note: Why does this variable exist? When should it be used?
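
For the note above, a hypothetical sketch of the idea (the mapping and names are assumptions, not litellm's actual tables):

```python
import os

# Hypothetical: dynamic_api_key is typically resolved at runtime from a
# provider-specific environment variable when no key is passed in.
PROVIDER_KEY_ENV = {
    "openrouter": "OPENROUTER_API_KEY",
    "groq": "GROQ_API_KEY",
}

def resolve_dynamic_api_key(provider: str) -> str | None:
    env_var = PROVIDER_KEY_ENV.get(provider)
    return os.environ.get(env_var) if env_var else None
```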

Motivation

I have some questions about this file: https://github.com/BerriAI/litellm/blob/main/litellm/utils.py

Memo for future refactoring

get_llm_provider

This function decides "custom_llm_provider", "dynamic_api_key", and "api_base".
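
A hedged usage sketch (the return order is assumed from the current source and may change):

```python
from litellm import get_llm_provider

# Assumed return order: (model, custom_llm_provider, dynamic_api_key, api_base).
model, custom_llm_provider, dynamic_api_key, api_base = get_llm_provider(
    model="groq/llama3-8b-8192"
)
# Expected: custom_llm_provider == "groq"; dynamic_api_key and api_base
# filled in for OpenAI-compatible providers such as groq.
```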

TODO:

ishaan-jaff commented 5 months ago

contributions welcome on this