Closed Dingzhen778 closed 5 months ago
This issue has been marked as stale because it has been over 30 days without any activity.
This issue has been closed because it has been marked as stale and there has been no activity for over 7 days.
Search before asking
- [x] I had searched in the issues and found no similar issues.
Operating system information
Linux
Python version information
3.10
DB-GPT version
main
Related scenes
- [x] Chat Data
- [ ] Chat Excel
- [ ] Chat DB
- [ ] Chat Knowledge
- [ ] Model Management
- [ ] Dashboard
- [ ] Plugins
Installation Information
- [x] Installation From Source
- [ ] Docker Installation
- [ ] Docker Compose Installation
- [ ] Cluster Installation
- [ ] AutoDL Image
- [ ] Other
Device information
GPU:A6000
Models information
LLM:fine-tuned Qwen-7B&Qwen-72B
What happened
I can only change the LLM in .env via the OpenAI API, and only one LLM shows in the GUI.
What you expected to happen
I want to change the LLM in the GUI rather than in .env. I have 4-5 LLMs serving inference at the same time and want to switch between them in the GUI. How should I edit .env? I have tried many ways, but only one LLM is usable.
How to reproduce
Edit .env:

LLM_MODEL=proxyllm
PROXYLLM_BACKEND=Qwen-72B-Chat (and other LLMs)
PROXY_SERVER_URL=http://xxxx:xxxx/v1/chat/completions (and other LLMs)
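Assuming each backend exposes an OpenAI-compatible `/v1/chat/completions` endpoint (as the `PROXY_SERVER_URL` above implies), the request body each proxy model receives looks roughly like the sketch below; the `"model"` field is what would let a single endpoint distinguish between backends such as Qwen-7B and Qwen-72B-Chat. `build_chat_payload` is a hypothetical helper for illustration, not part of DB-GPT.

```python
import json


def build_chat_payload(model: str, prompt: str) -> dict:
    # Minimal OpenAI-style chat completion request body; the "model"
    # field is how a proxy endpoint can route between several backends.
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


payload = build_chat_payload("Qwen-72B-Chat", "hello")
print(json.dumps(payload))
```

This only shows the request shape; whether the DB-GPT GUI can switch the `"model"` value at runtime is exactly what this issue is asking about.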
Additional context
No response
Are you willing to submit PR?
- [x] Yes I am willing to submit a PR!
Hello 👋, have you managed to solve this problem? I would humbly appreciate any advice.