This project unifies the management of LLM APIs. It can call multiple backend services through a single API interface, converting all of them to the OpenAI format and supporting load balancing. Currently supported backend services include OpenAI, Anthropic, DeepBricks, OpenRouter, Gemini, Vertex, and more.
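Because every backend is exposed through the OpenAI-compatible format, a standard OpenAI client can talk to the gateway directly. The snippet below is a minimal sketch; the `base_url`, API key, and model name are placeholder assumptions for illustration, not values documented by this project.

```python
# Minimal sketch: calling the unified gateway with the official OpenAI SDK.
# base_url, api_key, and model below are assumptions, not documented defaults.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # assumed gateway address
    api_key="sk-your-gateway-key",        # assumed gateway API key
)

# The gateway forwards the request to whichever backend serves this model
# and returns the response in OpenAI format.
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```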
It is great that the model field no longer has to be filled in and can be fetched automatically from the external API, but this makes the model-renaming feature unusable. In addition, the external API may expose a few models that do not work well, so it would be helpful to be able to disable them manually.
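A hedged sketch of how the requested behavior could work: after the model list is fetched from the external API, apply a user-defined alias map (renaming) and a disabled set (manual blocking) before exposing the models. The function and field names here are hypothetical and are not part of the project's actual configuration.

```python
# Hypothetical sketch: rename and disable models fetched from an external API.
# The alias/disabled structures and the function name are illustrative only.
from typing import Dict, List, Set

def filter_models(
    fetched_models: List[str],
    aliases: Dict[str, str],   # upstream name -> name exposed to clients
    disabled: Set[str],        # upstream names to hide entirely
) -> Dict[str, str]:
    """Return a mapping of exposed model name -> upstream model name."""
    exposed: Dict[str, str] = {}
    for upstream_name in fetched_models:
        if upstream_name in disabled:
            continue  # manually disabled models are never exposed
        exposed_name = aliases.get(upstream_name, upstream_name)
        exposed[exposed_name] = upstream_name
    return exposed

# Example: rename one model and hide another.
models = filter_models(
    fetched_models=["gpt-4o", "gpt-3.5-turbo", "broken-model"],
    aliases={"gpt-4o": "my-gpt-4o"},
    disabled={"broken-model"},
)
print(models)  # {'my-gpt-4o': 'gpt-4o', 'gpt-3.5-turbo': 'gpt-3.5-turbo'}
```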