netease-youdao / QAnything

Question and Answer based on Anything.
https://qanything.ai
Apache License 2.0

[BUG] No 'Access-Control-Allow-Origin' header is present on the requested resource when remotely calling a server-deployed OpenAI-compatible API #113

Open wujingbo-web opened 5 months ago

wujingbo-web commented 5 months ago

Is there an existing issue / discussion for this?

Is there an existing answer for this in the FAQ?

Current Behavior

Access to fetch at 'http://100.161.35.42:8777/api/local_doc_qa/local_doc_chat' from origin 'http://100.161.35.42:5052' has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled. :8777/api/local_doc_qa/local_doc_chat:1 Failed to load resource: net::ERR_FAILED

Expected Behavior

Allow cross-origin (cross-machine) calls to the custom OpenAI-compatible API endpoint.

Environment

- OS: CentOS 7.9
- NVIDIA Driver: 535.104.05
- CUDA: 12.2
- Docker Compose: v2.24.4
- NVIDIA GPU Memory: 24 GB × 2

QAnything logs

Browser console log:

Access to fetch at 'http://100.161.35.42:8777/api/local_doc_qa/local_doc_chat' from origin 'http://100.161.35.42:5052' has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled.
:8777/api/local_doc_qa/local_doc_chat:1 Failed to load resource: net::ERR_FAILED

sanic_api.log:

rerank_port: 9001
embed_port: 9001
<Logger debug_logger (INFO)>
<Logger qa_logger (INFO)>
[2024-02-15 11:45:15 +0800] [834] [INFO] Sanic Extensions:
[2024-02-15 11:45:15 +0800] [834] [INFO]   > injection [0 dependencies; 0 constants]
[2024-02-15 11:45:15 +0800] [834] [INFO]   > openapi [http://0.0.0.0:8777/docs]
[2024-02-15 11:45:15 +0800] [834] [INFO]   > http
[2024-02-15 11:45:15 +0800] [834] [INFO]   > templating [jinja2==3.1.3]
[2024-02-15 11:45:15 +0800] [831] [INFO] Starting worker [831]
INFO:debug_logger:[SUCCESS] 数据库qanything检查通过
[2024-02-15 11:45:15 +0800] [835] [INFO] Starting worker [835]
UPLOAD_ROOT_PATH: /workspace/qanything_local/QANY_DB/content
llm_api_serve_port: None
rerank_port: 9001
embed_port: 9001
<Logger debug_logger (INFO)>
<Logger qa_logger (INFO)>
[2024-02-15 11:45:15 +0800] [827] [INFO] Sanic Extensions:
[2024-02-15 11:45:15 +0800] [827] [INFO]   > injection [0 dependencies; 0 constants]
[2024-02-15 11:45:15 +0800] [827] [INFO]   > openapi [http://0.0.0.0:8777/docs]
[2024-02-15 11:45:15 +0800] [827] [INFO]   > http
[2024-02-15 11:45:15 +0800] [827] [INFO]   > templating [jinja2==3.1.3]
INFO:debug_logger:ADD COLUMN timestamp
INFO:debug_logger:1060 (42S21): Duplicate column name 'timestamp'
INFO:debug_logger:[SUCCESS] 数据库qanything连接成功
init local_doc_qa in online
INFO:debug_logger:[SUCCESS] 数据库qanything检查通过
INFO:debug_logger:[SUCCESS] 数据库qanything检查通过
INFO:debug_logger:ADD COLUMN timestamp
INFO:debug_logger:1060 (42S21): Duplicate column name 'timestamp'
INFO:debug_logger:[SUCCESS] 数据库qanything连接成功
init local_doc_qa in online
[2024-02-15 11:45:15 +0800] [828] [INFO] Starting worker [828]
INFO:debug_logger:ADD COLUMN timestamp
INFO:debug_logger:1060 (42S21): Duplicate column name 'timestamp'
INFO:debug_logger:[SUCCESS] 数据库qanything连接成功
init local_doc_qa in online
[2024-02-15 11:45:15 +0800] [834] [INFO] Starting worker [834]
[2024-02-15 11:45:15 +0800] [827] [INFO] Starting worker [827]
INFO:debug_logger:list_kbs zzp
INFO:debug_logger:all kb infos: [{'kb_id': 'KB1601cabb159c4a6b88db376d718b5fc7', 'kb_name': '默认知识库'}]
INFO:debug_logger:local_doc_chat zzp
INFO:debug_logger:rerank True
INFO:debug_logger:history: []
INFO:debug_logger:question: 6000万的项目审批者是?
INFO:debug_logger:kb_ids: ['KB1601cabb159c4a6b88db376d718b5fc7']
INFO:debug_logger:user_id: zzp
INFO:debug_logger:check_kb_exist [('KB1601cabb159c4a6b88db376d718b5fc7',)]
INFO:debug_logger:collection zzp exists
INFO:debug_logger:partitions: ['KB1601cabb159c4a6b88db376d718b5fc7']
INFO:debug_logger:streaming: True
INFO:debug_logger:start generate answer
INFO:debug_logger:start generate...
INFO:debug_logger:milvus group number: 1
INFO:debug_logger:milvus search time: 0.03197598457336426
INFO:root:Warning: model not found. Using cl100k_base encoding.
[2024-02-15 11:45:39 +0800] [828] [ERROR] Exception occurred while handling uri: 'http://100.161.35.42:8777/api/local_doc_qa/local_doc_chat'
Traceback (most recent call last):
  File "handle_request", line 132, in handle_request
    "_asgi_lifespan",
  File "/usr/local/lib/python3.10/dist-packages/sanic/response/types.py", line 547, in stream
    await self.streaming_fn(self)
  File "/workspace/qanything_local/qanything_kernel/qanything_server/handler.py", line 355, in generate_answer
    for resp, next_history in local_doc_qa.get_knowledge_based_answer(
  File "/workspace/qanything_local/qanything_kernel/core/local_doc_qa.py", line 227, in get_knowledge_based_answer
    source_documents = self.reprocess_source_documents(query=query,
  File "/workspace/qanything_local/qanything_kernel/core/local_doc_qa.py", line 151, in reprocess_source_documents
    query_token_num = self.llm.num_tokens_from_messages([query])
  File "/workspace/qanything_local/qanything_kernel/connector/llm/llm_for_openai_api.py", line 95, in num_tokens_from_messages
    raise NotImplementedError(
NotImplementedError: num_tokens_from_messages() is not implemented for model qwen-72b-int4. See https://github.com/openai/openai-python/blob/main/chatml.md for information on how messages are converted to tokens.
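
Note that the exception in sanic_api.log, not CORS itself, is what actually kills the request: num_tokens_from_messages() raises NotImplementedError for the non-OpenAI model name qwen-72b-int4, the streaming response is aborted, and the browser then reports the missing Access-Control-Allow-Origin header on the failed response. A minimal sketch of a workaround is to fall back to tiktoken's cl100k_base encoding for model names tiktoken does not recognize; the standalone function below only mirrors the method name in llm_for_openai_api.py, while the exact class layout and message format are assumptions:

```python
# Sketch only: token counting with a fallback for unknown model names.
# Assumes messages are plain strings, as in num_tokens_from_messages([query]).
import tiktoken


def num_tokens_from_messages(messages, model="qwen-72b-int4"):
    """Estimate token counts without raising for non-OpenAI model names."""
    try:
        # Works for model names tiktoken knows (gpt-3.5-turbo, gpt-4, ...).
        encoding = tiktoken.encoding_for_model(model)
    except KeyError:
        # qwen-72b-int4 is unknown to tiktoken; use a generic BPE as an estimate.
        encoding = tiktoken.get_encoding("cl100k_base")
    return sum(len(encoding.encode(str(m))) for m in messages)
```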

Steps To Reproduce

1. The QAnything web UI runs on the local server at http://100.161.35.42:5052/qanything and is configured to call an OpenAI-compatible API on a remote cloud server at 10.168.24.10:18081/v1 with the model name qwen-72b-int4. The request fails, and the browser console reports that cross-origin access is blocked (a direct-request check that bypasses the browser is sketched after this list):

Access to fetch at 'http://100.161.35.42:8777/api/local_doc_qa/local_doc_chat' from origin 'http://100.161.35.42:5052' has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled. :8777/api/local_doc_qa/local_doc_chat:1 Failed to load resource: net::ERR_FAILED

2. The OpenAI-compatible endpoint on 10.168.24.10 is started with vLLM using the parameters below; other applications can call it and chat with it normally (see also the CORS sketch after this list):
   python3 -m vllm.entrypoints.openai.api_server --model /data1/model/Qwen1.5-72B-Chat-AWQ --served-model-name qwen-72b-int4 --trust-remote-code --max-model-len 9600 -q awq --dtype float16 -tp 1 --gpu-memory-utilization 0.85 --enforce-eager --host 10.168.24.10 --port 18081
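
To rule the browser out, the API can be called directly from Python so no CORS check is involved. The field names below are taken from the debug_logger output above (user_id, kb_ids, question, history, rerank, streaming) and should be treated as assumptions about the request body rather than the documented schema:

```python
# Sketch: call /api/local_doc_qa/local_doc_chat directly, bypassing the browser.
# Field names are inferred from the debug_logger output above, not from docs.
import requests

resp = requests.post(
    "http://100.161.35.42:8777/api/local_doc_qa/local_doc_chat",
    json={
        "user_id": "zzp",
        "kb_ids": ["KB1601cabb159c4a6b88db376d718b5fc7"],
        "question": "6000万的项目审批者是?",
        "history": [],
        "rerank": True,
        "streaming": False,  # keep it a single JSON response for this test
    },
    timeout=120,
)
print(resp.status_code)
print(resp.text)
```

If this direct call returns the same 500 with the NotImplementedError shown in the logs, the problem is the token-counting code path rather than CORS configuration.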
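
As for the missing Access-Control-Allow-Origin header itself: the log shows the API is a Sanic app with Sanic Extensions loaded, and Sanic Extensions can inject CORS headers from config values. A minimal sketch, assuming a generic Sanic app rather than QAnything's actual server module:

```python
# Sketch: permissive CORS via Sanic Extensions config (generic app, not QAnything's code).
from sanic import Sanic
from sanic_ext import Extend

app = Sanic("qanything_api")

# Allow the web UI origin; use "*" only while debugging.
app.config.CORS_ORIGINS = "http://100.161.35.42:5052"

Extend(app)
```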

Anything else?

No response

jingzl commented 5 months ago

I ran into the same problem; it reports a cross-origin error.

Cordy27 commented 5 months ago

Running into the same problem here, hoping for an answer.

successren commented 3 months ago

This issue has been resolved, please see #188.