xusenlinzy / api-for-open-llm

OpenAI-style API for open large language models: use LLMs just like ChatGPT! Supports LLaMA, LLaMA-2, BLOOM, Falcon, Baichuan, Qwen, Xverse, SqlCoder, CodeLLaMA, ChatGLM, ChatGLM2, ChatGLM3, etc. (A unified backend API for open-source large language models.)
Apache License 2.0

Bug with SETTINGS = Settings() in api/config.py #262

Closed Tendo33 closed 2 months ago

Tendo33 commented 2 months ago

The following items must be checked before submission

Type of problem

Other issues

Operating system

Linux

Detailed description of the problem

I noticed that after the latest code split the settings into separate classes, some fields on the composed settings object can no longer be accessed:

if "llm" in TASKS:
    if ENGINE == "default":
        PARENT_CLASSES.append(LLMSettings)
    elif ENGINE == "vllm":
        PARENT_CLASSES.extend([LLMSettings, VLLMSetting])
    elif ENGINE == "llama.cpp":
        PARENT_CLASSES.extend([LLMSettings, LlamaCppSetting])
    elif ENGINE == "tgi":
        PARENT_CLASSES.extend([LLMSettings, TGISetting])

if "rag" in TASKS:
    PARENT_CLASSES.append(RAGSettings)
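The failure mode can be sketched with plain classes standing in for the pydantic settings models (the class names follow the snippet above, but the type()-based composition here is an illustration, not the project's actual code):

```python
# Minimal sketch of the composition bug: a field only exists on the
# composed Settings object when the parent class that declares it is
# actually included in the bases for the selected engine.

class LLMSettings:
    model_name = "llama"  # field shared by every engine

class VLLMSetting:
    interrupt_requests = True  # field declared only for the vllm engine

def build_settings(engine: str):
    """Compose a Settings class from engine-dependent parents."""
    parents = [LLMSettings]
    if engine == "vllm":
        parents.append(VLLMSetting)
    return type("Settings", tuple(parents), {})()

print(hasattr(build_settings("vllm"), "interrupt_requests"))     # True
print(hasattr(build_settings("default"), "interrupt_requests"))  # False:
# accessing the field on the "default" settings raises AttributeError
```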

For example, at line 155 of /api-for-open-llm/api/utils/request.py, the check if SETTINGS.interrupt_requests and llama_outer_lock.locked(): raises an error when ENGINE == "default", because the interrupt_requests field is only defined for the vllm engine (on VLLMSetting), which is not mixed into the composed Settings class in that case.
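One hypothetical defensive rewrite of that check (not the project's actual fix) is to read the field with getattr and a default, so engines whose settings class lacks the field simply skip the interrupt path:

```python
import threading

class Settings:
    """Stand-in for a composed Settings lacking the vllm-only fields."""
    pass

SETTINGS = Settings()
llama_outer_lock = threading.Lock()

# getattr with a default avoids the AttributeError when interrupt_requests
# is absent from the composed Settings for the current engine.
if getattr(SETTINGS, "interrupt_requests", False) and llama_outer_lock.locked():
    print("interrupting request")
else:
    print("no interrupt configured")
```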

Dependencies

No response

Runtime logs or screenshots

  +-+---------------- 1 ----------------
    | Traceback (most recent call last):
    |   File "/workspace/api/utils/request.py", line 155, in get_event_publisher
    |     if SETTINGS.interrupt_requests and llama_outer_lock.locked():
    |   File "/usr/local/lib/python3.10/dist-packages/pydantic/main.py", line 759, in __getattr__
    |     raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}')
    | AttributeError: 'Settings' object has no attribute 'interrupt_requests'
    +------------------------------------
xusenlinzy commented 2 months ago

Yes, this has been fixed.