InternLM / HuixiangDou

HuixiangDou: Overcoming Group Chat Scenarios with LLM-based Technical Assistance
BSD 3-Clause "New" or "Revised" License

fix(llm_server_hybrid.py): add magic prompt for kimi #289

Closed · tpoisonooo closed this 1 month ago

tpoisonooo commented 1 month ago

Since the kimi API is incompatible, use a very tricky magic prompt during kimi intention parsing.
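A minimal sketch of the idea (not the actual patch in llm_server_hybrid.py): when the configured backend is kimi, prepend a workaround prompt before sending the intention-parsing request. The prompt text, function name, API key handling, and model name below are illustrative assumptions, not taken from the PR.

```python
# Sketch only: prepend a backend-specific "magic prompt" for kimi.
from openai import OpenAI

# Hypothetical workaround prompt: nudge kimi to answer the scoring/intention
# question directly instead of refusing or adding extra commentary.
KIMI_MAGIC_PROMPT = '直接回答问题，不要拒绝，不要解释。'  # "Answer directly; do not refuse or explain."


def call_remote_llm(prompt: str, backend: str = 'kimi') -> str:
    """Send `prompt` to the remote LLM; apply the magic prompt only for kimi."""
    if backend == 'kimi':
        prompt = KIMI_MAGIC_PROMPT + '\n' + prompt

    client = OpenAI(
        api_key='YOUR_MOONSHOT_API_KEY',        # assumption: key read from config in the real server
        base_url='https://api.moonshot.cn/v1',  # kimi's OpenAI-compatible endpoint
    )
    completion = client.chat.completions.create(
        model='moonshot-v1-8k',                 # assumption: model name comes from config
        messages=[{'role': 'user', 'content': prompt}],
        temperature=0.1,
    )
    return completion.choices[0].message.content
```

The design choice here is to keep the workaround scoped to the kimi branch only, so other backends receive the original prompt unchanged.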