OpenAI-style API for open large language models — use open LLMs just like ChatGPT! Supports LLaMA, LLaMA-2, BLOOM, Falcon, Baichuan, Qwen, Xverse, SqlCoder, CodeLLaMA, ChatGLM, ChatGLM2, ChatGLM3, etc. 开源大模型的统一后端接口 (a unified backend API for open-source large models)
提交前必须检查以下项目 | The following items must be checked before submission
[X] 请确保使用的是仓库最新代码(git pull),一些问题已被解决和修复。 | Make sure you are using the latest code from the repository (git pull), some issues have already been addressed and fixed.
[X] 我已阅读项目文档和FAQ章节并且已在Issue中对问题进行了搜索,没有找到相似问题和解决方案 | I have read the project documentation and FAQ, searched the existing issues / discussions, and found no similar problem or solution.
问题类型 | Type of problem
None
操作系统 | Operating system
Linux
详细描述问题 | Detailed description of the problem
streamlit run streamlit_app.py
Dependencies
# 请在此处粘贴依赖情况
# Please paste the dependencies here
运行日志或截图 | Runtime logs or screenshots
File "/data/WCY/anaconda3/envs/api/lib/python3.11/site-packages/streamlit/runtime/scriptrunner/exec_code.py", line 88, in exec_func_with_error_handling
    result = func()
             ^^^^^^
File "/data/WCY/anaconda3/envs/api/lib/python3.11/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 590, in code_to_exec
    exec(code, module.__dict__)
File "/data/WCY/api-for-open-llm-master/streamlit-demo/streamlit_app.py", line 83, in <module>
    main()
File "/data/WCY/api-for-open-llm-master/streamlit-demo/streamlit_app.py", line 10, in main
    from streamlit_gallery.components import chat, doc_chat
File "/data/WCY/api-for-open-llm-master/streamlit-demo/streamlit_gallery/components/__init__.py", line 2, in <module>
    from .doc_chat.streamlit_app import main as doc_chat
File "/data/WCY/api-for-open-llm-master/streamlit-demo/streamlit_gallery/components/doc_chat/streamlit_app.py", line 9, in <module>
    from .utils import DocServer, DOCQA_PROMPT
File "/data/WCY/api-for-open-llm-master/streamlit-demo/streamlit_gallery/components/doc_chat/utils.py", line 18, in <module>
    os.environ["CO_API_URL"] = EMBEDDING_API_BASE
File "<frozen os>", line 684, in __setitem__
File "<frozen os>", line 758, in encode
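The final exception line is cut off, but the `<frozen os>` frames in `__setitem__`/`encode` suggest (this is an assumption, not confirmed by the log) that `os.environ` rejected a non-`str` value, i.e. `EMBEDDING_API_BASE` was likely `None` because no embedding API base was configured. A minimal sketch of that failure mode and a defensive guard:

```python
import os

# Assumption: EMBEDDING_API_BASE resolves to None when the embedding
# API setting is missing; os.environ only accepts str values, so the
# assignment below raises TypeError inside os.environ.__setitem__.
EMBEDDING_API_BASE = None  # simulates the unconfigured setting

try:
    os.environ["CO_API_URL"] = EMBEDDING_API_BASE
except TypeError as exc:
    print(f"TypeError: {exc}")

# Guard: only export the variable when it is actually configured.
if EMBEDDING_API_BASE:
    os.environ["CO_API_URL"] = EMBEDDING_API_BASE
```

If this is the cause, setting the embedding API base in the environment before `streamlit run streamlit_app.py` should avoid the crash.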