xusenlinzy / api-for-open-llm

OpenAI-style API for open large language models: use LLMs just like ChatGPT! Supports LLaMA, LLaMA-2, BLOOM, Falcon, Baichuan, Qwen, Xverse, SqlCoder, CodeLLaMA, ChatGLM, ChatGLM2, ChatGLM3, etc. (A unified backend interface for open-source large models.)
Apache License 2.0
2.16k stars 252 forks

"POST /v1/files HTTP/1.1" 404 Not Found #286

Closed: KEAI404 closed this issue 3 weeks ago

KEAI404 commented 3 weeks ago

提交前必须检查以下项目 | The following items must be checked before submission

问题类型 | Type of problem

模型推理和部署 | Model inference and deployment

操作系统 | Operating system

Windows

详细描述问题 | Detailed description of the problem

An error is raised when using doc_chat:

upf = client.files.create(file=open(filepath, "rb"), purpose="assistants")

```
Traceback (most recent call last):
  File "C:\Users\me.conda\envs\lib\site-packages\streamlit\runtime\scriptrunner\script_runner.py", line 600, in _run_script
    exec(code, module.__dict__)
  File "E:\streamlit-demo\streamlit_app.py", line 67, in <module>
    main()
  File "E:\streamlit-demo\streamlit_app.py", line 62, in main
    page.show()
  File "E:\streamlit-demo\streamlit_gallery\utils\page.py", line 49, in show
    self._selected()
  File "E:\streamlit-demo\streamlit_gallery\components\doc_chat\streamlit_app.py", line 110, in main
    create_file_index(
  File "C:\Users\me.conda\envs\lib\site-packages\streamlit\runtime\caching\cache_utils.py", line 165, in wrapper
    return cached_func(*args, **kwargs)
  File "C:\Users\me.conda\envs\lib\site-packages\streamlit\runtime\caching\cache_utils.py", line 194, in __call__
    return self._get_or_create_cached_value(args, kwargs)
  File "C:\Users\me.conda\envs\lib\site-packages\streamlit\runtime\caching\cache_utils.py", line 221, in _get_or_create_cached_value
    return self._handle_cache_miss(cache, value_key, func_args, func_kwargs)
  File "C:\Users\me.conda\envs\lib\site-packages\streamlit\runtime\caching\cache_utils.py", line 277, in _handle_cache_miss
    computed_value = self._info.func(*func_args, **func_kwargs)
  File "E:\streamlit-demo\streamlit_gallery\components\doc_chat\streamlit_app.py", line 48, in create_file_index
    file_id = server.upload(
  File "E:\streamlit-demo\streamlit_gallery\components\doc_chat\utils.py", line 73, in upload
    upf = self.client.files.create(file=open(filepath, "rb"), purpose="assistants")
  File "C:\Users\me.conda\envs\lib\site-packages\openai\resources\files.py", line 113, in create
    return self._post(
  File "C:\Users\me.conda\envs\lib\site-packages\openai\_base_client.py", line 1240, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File "C:\Users\me.conda\envs\lib\site-packages\openai\_base_client.py", line 921, in request
    return self._request(
  File "C:\Users\me.conda\envs\lib\site-packages\openai\_base_client.py", line 1020, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.NotFoundError: Error code: 404 - {'detail': 'Not Found'}
```
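For context, `client.files.create(...)` issues a `POST` to the `files` path under the client's `base_url`, so a 404 here means the server never registered that route (the request itself reached the server). A minimal sketch of the URL the client targets, assuming a hypothetical local deployment at `localhost:8000`:

```python
from urllib.parse import urljoin

# Hypothetical base URL of a local api-for-open-llm deployment (adjust to yours).
base_url = "http://localhost:8000/v1/"

# client.files.create(...) posts to the "files" path relative to base_url:
print(urljoin(base_url, "files"))  # -> http://localhost:8000/v1/files
```

If that route is missing on the server side, every upload will fail with `{'detail': 'Not Found'}` regardless of the file contents.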

Dependencies

# 请在此处粘贴依赖情况
# Please paste the dependencies here

运行日志或截图 | Runtime logs or screenshots

No response

xusenlinzy commented 3 weeks ago

Check whether the environment variable TASKS=llm,rag was set when you started the model server.
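A minimal launch sketch based on that hint (assumptions: `server.py` is a placeholder for your actual entrypoint, and the file-related routes are only mounted when the `rag` task group is enabled, which is what the maintainer's suggestion implies):

```shell
# Enable both the LLM and RAG task groups before starting the server;
# without "rag", the /v1/files endpoint is presumably never registered,
# which would produce exactly this 404.
export TASKS=llm,rag
echo "$TASKS"

# Then launch in the same shell (hypothetical entrypoint; use your own command):
# python server.py
```

The same value can go into a `.env` file if your deployment reads one.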