InternLM / HuixiangDou

HuixiangDou: Overcoming Group Chat Scenarios with LLM-based Technical Assistance

feat(config.ini): kimi support auto #243

Closed. tpoisonooo closed this pull request 2 months ago.

tpoisonooo commented 2 months ago

If the kimi model name is set to `auto`, llm_server_hybrid.py will automatically select a model based on the prompt length.
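
A minimal sketch of what such length-based selection could look like, assuming Moonshot's public kimi model tiers (moonshot-v1-8k / 32k / 128k); the function name, thresholds, and character-count heuristic below are illustrative assumptions, not the repository's actual implementation in llm_server_hybrid.py.

```python
def choose_kimi_model(prompt: str) -> str:
    """Pick a kimi (Moonshot) model tier from the prompt length.

    Rough heuristic: longer prompts need a larger context window.
    Thresholds are character counts chosen for illustration only.
    """
    length = len(prompt)
    if length <= 4000:
        return 'moonshot-v1-8k'
    if length <= 16000:
        return 'moonshot-v1-32k'
    return 'moonshot-v1-128k'


if __name__ == '__main__':
    short_prompt = 'How do I install HuixiangDou?'
    long_prompt = 'context ' * 10000
    print(choose_kimi_model(short_prompt))  # moonshot-v1-8k
    print(choose_kimi_model(long_prompt))   # moonshot-v1-128k
```

The benefit of an `auto` setting is that short queries stay on the cheaper small-context tier, while long retrieval-augmented prompts are routed to a larger context window without the user editing config.ini by hand.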