Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
[Feature]: Add design policy document for huge utils.py maintenance #3345
Open
nobu007 opened 7 months ago
The Feature (request)
How about adding a design policy document? It would make the code easier to maintain.
Content idea
The role of variables
model
The model name. For examples, see https://litellm.vercel.app/docs/providers/openrouter
custom_llm_provider
The provider name, e.g. openai, anthropic, openrouter.
By default it is the first "/"-separated segment of model: anthropic/claude-3-xx => anthropic, openrouter/anthropic/claude-3-xx => openrouter (see the sketch after this list).
api_base
dynamic_api_key
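A minimal sketch of that default rule (infer_provider is a hypothetical helper for illustration, not litellm's actual implementation):

```python
# Hypothetical helper illustrating the default rule: the provider is the
# first "/"-separated segment of the model string, if one is present.
def infer_provider(model: str) -> str | None:
    if "/" in model:
        # "openrouter/anthropic/claude-3-xx" -> "openrouter"
        return model.split("/", 1)[0]
    return None  # no prefix; must be resolved another way

assert infer_provider("anthropic/claude-3-xx") == "anthropic"
assert infer_provider("openrouter/anthropic/claude-3-xx") == "openrouter"
```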
Overall decision flow
The priority order would be documented here. A strawman sketch follows.
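As a strawman for the priority order (my reading of the current code, not authoritative; resolve_provider is a hypothetical helper):

```python
# Strawman of the priority order as I read the current code (not authoritative):
def resolve_provider(model: str,
                     custom_llm_provider: str | None = None,
                     api_base: str | None = None) -> str | None:
    # 1. An explicitly passed custom_llm_provider always wins.
    if custom_llm_provider:
        return custom_llm_provider
    # 2. Otherwise fall back to the "provider/..." prefix of model.
    if "/" in model:
        return model.split("/", 1)[0]
    # 3. Otherwise known-model lists and api_base heuristics apply (omitted here).
    return None
```

The design document would pin down exactly this ordering, so contributors stop rediscovering it by reading utils.py.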
Decision flow for each variable
(This may be too detailed for a design document.)
model
custom_llm_provider
api_base
dynamic_api_key
Note: for each variable, answer "Why does this variable exist?" and "When should you use it?"
Motivation
I have some questions about https://github.com/BerriAI/litellm/blob/main/litellm/utils.py
Memo for future refactoring
get_llm_provider
This function determines "custom_llm_provider", "dynamic_api_key", and "api_base".
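For context, today's call shape, if I read utils.py correctly (the signature may drift across versions):

```python
from litellm import get_llm_provider

# Current call shape: four values out; dynamic_api_key and the api_key
# argument are the ones questioned in the TODOs below.
model, custom_llm_provider, dynamic_api_key, api_base = get_llm_provider(
    model="openrouter/anthropic/claude-3-xx"
)
print(custom_llm_provider)  # "openrouter"
```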
TODO:
Remove the "api_key" argument.
Remove the "dynamic_api_key" return value.
Add a notice comment about the api_base -> base_url rename. Ref: https://github.com/microsoft/autogen/pull/383/files (a sketch of the resulting signature follows)
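If all three TODOs land, the signature might shrink to something like this (illustrative only, not an agreed design):

```python
# Illustrative target shape if all three TODOs land (not an agreed design):
def get_llm_provider(
    model: str,
    custom_llm_provider: str | None = None,
    api_base: str | None = None,  # note: api_base may later be renamed base_url
) -> tuple[str, str, str | None]:
    """Return (model, custom_llm_provider, api_base):
    no api_key in, no dynamic_api_key out."""
    ...
```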