InternLM / HuixiangDou
HuixiangDou: Overcoming Group Chat Scenarios with LLM-based Technical Assistance
License: BSD 3-Clause "New" or "Revised" License
Stars: 1.13k · Forks: 93
fix(web/proxy): backend use `remote` #220
Status: Closed (closed by tpoisonooo 2 months ago)

tpoisonooo commented 2 months ago:
- fix max_length when `enable_local=1` and `enable_remote=1`
- web server supports using a remote LLM only (for example Kimi)
- fix model name in `call_puyu`
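The backend-selection behavior the fixes above describe can be sketched as follows. This is a hypothetical illustration, not the actual HuixiangDou code: the function name `select_backend` and the default token limits are assumptions, but it captures the intent of honoring `max_length` when both `enable_local=1` and `enable_remote=1`, and of running remote-only when the local LLM is disabled.

```python
def select_backend(enable_local: int,
                   enable_remote: int,
                   token_count: int,
                   local_max_length: int = 4096,
                   remote_max_length: int = 128000) -> str:
    """Pick which LLM backend should serve a request.

    Hypothetical sketch: when both backends are enabled, prefer the
    local model unless the prompt exceeds its max_length; fall back
    to the remote model (e.g. Kimi) for long prompts or when the
    local model is disabled entirely.
    """
    if enable_remote and (not enable_local or token_count > local_max_length):
        if token_count > remote_max_length:
            raise ValueError('prompt exceeds remote max_length')
        return 'remote'
    if enable_local and token_count <= local_max_length:
        return 'local'
    raise ValueError('no backend can serve this request')
```

With both flags on, a short prompt stays local while a long one is routed to the remote model; with `enable_local=0`, every request goes remote, matching the "remote LLM only" mode described above.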