OpenAI-style API for open large language models, letting you use open LLMs just like ChatGPT! Supports LLaMA, LLaMA-2, BLOOM, Falcon, Baichuan, Qwen, Xverse, SqlCoder, CodeLLaMA, ChatGLM, ChatGLM2, ChatGLM3, etc. A unified backend interface for open-source large language models.
The following items must be checked before submission
[X] Make sure you are using the latest code from the repository (git pull); some issues have already been addressed and fixed.
[X] I have read the project documentation and the FAQ, and I have searched the existing issues/discussions without finding a similar problem or solution.
Type of problem
Other issues
Operating system
Linux
Detailed description of the problem
I noticed that in the latest code, after the different settings were split into separate classes, some fields can no longer be accessed on the combined settings object.
```python
if "llm" in TASKS:
    if ENGINE == "default":
        PARENT_CLASSES.append(LLMSettings)
    elif ENGINE == "vllm":
        PARENT_CLASSES.extend([LLMSettings, VLLMSetting])
    elif ENGINE == "llama.cpp":
        PARENT_CLASSES.extend([LLMSettings, LlamaCppSetting])
    elif ENGINE == "tgi":
        PARENT_CLASSES.extend([LLMSettings, TGISetting])

if "rag" in TASKS:
    PARENT_CLASSES.append(RAGSettings)
```
For example, at line 155 of /api-for-open-llm/api/utils/request.py, `if SETTINGS.interrupt_requests and llama_outer_lock.locked():` raises an error when ENGINE == "default", because `interrupt_requests` is only defined on the vllm engine's settings.
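One defensive workaround (my sketch, not necessarily how the maintainers intend to fix it) is to read engine-specific fields with `getattr` and a default, so a settings object built for ENGINE == "default" simply treats the feature as disabled. The `should_interrupt` helper and the `SimpleNamespace` stand-ins below are hypothetical, chosen to mirror the call site:

```python
import threading
from types import SimpleNamespace

def should_interrupt(settings, lock) -> bool:
    # A settings object that does not define `interrupt_requests`
    # (e.g. one built without VLLMSetting) is treated as "disabled"
    # instead of raising AttributeError.
    return bool(getattr(settings, "interrupt_requests", False)) and lock.locked()

lock = threading.Lock()

# Settings built for ENGINE == "default": field absent, no AttributeError.
print(should_interrupt(SimpleNamespace(), lock))  # False

# Settings built for ENGINE == "vllm": field present and lock held.
vllm_settings = SimpleNamespace(interrupt_requests=True)
lock.acquire()
print(should_interrupt(vllm_settings, lock))  # True
```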
Dependencies
No response
Runtime logs or screenshots
```
Traceback (most recent call last):
  File "/workspace/api/utils/request.py", line 155, in get_event_publisher
    if SETTINGS.interrupt_requests and llama_outer_lock.locked():
  File "/usr/local/lib/python3.10/dist-packages/pydantic/main.py", line 759, in __getattr__
    raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}')
AttributeError: 'Settings' object has no attribute 'interrupt_requests'
```
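The failure can be reproduced in isolation with a minimal pydantic model. The class names and fields below are mine, chosen to mirror the dynamic composition in the issue, not the project's actual definitions:

```python
from pydantic import BaseModel

class LLMSettings(BaseModel):
    max_tokens: int = 512

class VLLMSetting(BaseModel):
    interrupt_requests: bool = True

# Mirror the dynamic composition: ENGINE == "default" only pulls in
# LLMSettings, so fields declared on VLLMSetting never reach Settings.
PARENT_CLASSES = [LLMSettings]
Settings = type("Settings", tuple(PARENT_CLASSES), {})
SETTINGS = Settings()

try:
    SETTINGS.interrupt_requests
except AttributeError as exc:
    print(exc)  # 'Settings' object has no attribute 'interrupt_requests'
```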