Closed: boaty closed this issue 1 month ago.
You need to set an API key for your chosen LLM supplier.
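If you want to rule RAGFlow out, a quick way to verify the key itself is to call the same DashScope endpoint that appears in the error logs below. This is only a sketch for illustration; the request payload (model name, "input"/"prompt" fields) and the DASHSCOPE_API_KEY environment variable are assumptions, not something RAGFlow itself uses.

# Standalone check of the DashScope API key, independent of RAGFlow.
# The URL is taken from the error logs; the payload shape is an assumption
# based on DashScope's text-generation API and may need adjusting.
import os
import requests

DASHSCOPE_URL = "https://dashscope.aliyuncs.com/api/v1/services/aigc/text-generation/generation"

def check_dashscope_key(api_key: str) -> None:
    resp = requests.post(
        DASHSCOPE_URL,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        json={"model": "qwen-turbo", "input": {"prompt": "ping"}},
        timeout=30,
    )
    # 401 reproduces "Invalid API-key provided"; 200 means the key is fine
    # and the problem is elsewhere (e.g. timeouts or model settings).
    print(resp.status_code, resp.text[:200])

if __name__ == "__main__":
    check_dashscope_key(os.environ.get("DASHSCOPE_API_KEY", ""))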
I have a similar issue, and I confirm the API key has been set for Aliyun. However, the graph could not be built because of "Request timed out":
[INFO] [2024-08-25 11:54:03,688] [_internal._log] [line:96]: 172.18.0.5 - - [25/Aug/2024 11:54:03] "GET /v1/document/list?kb_id=71a4f66e622211efa8320242ac120006&page=1&page_size=10 HTTP/1.1" 200 -
[ERROR] [2024-08-25 11:54:10,550] [http_request._handle_response] [line:172]: Request: https://dashscope.aliyuncs.com/api/v1/services/aigc/text-generation/generation failed, status: 500, message: Request timed out, please try again later.
Traceback (most recent call last):
  File "/ragflow/graphrag/graph_extractor.py", line 139, in __call__
    result, token_count = self._process_document(text, prompt_variables)
  File "/ragflow/graphrag/graph_extractor.py", line 188, in _process_document
    if response.find("ERROR") >=0: raise Exception(response)
Exception: ERROR: Request timed out, please try again later.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "/ragflow/rag/svr/task_executor.py", line 165, in build
    cks = chunker.chunk(row["name"], binary=binary, from_page=row["from_page"],
  File "/ragflow/rag/app/knowledge_graph.py", line 18, in chunk
    chunks = build_knowlege_graph_chunks(tenant_id, sections, callback,
  File "/ragflow/graphrag/index.py", line 85, in build_knowlege_graph_chunks
    graphs.append(_.result().output)
  File "/usr/lib/python3.10/concurrent/futures/_base.py", line 458, in result
    return self.__get_result()
  File "/usr/lib/python3.10/concurrent/futures/_base.py", line 403, in __get_result
    raise self._exception
  File "/usr/lib/python3.10/concurrent/futures/thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/ragflow/graphrag/graph_extractor.py", line 145, in __call__
    if callback: callback("Knowledge graph extraction error:{}".format(str(e)))
  File "/ragflow/rag/svr/task_executor.py", line 80, in set_progress
    if prog is not None and prog < 0:
TypeError: '<' not supported between instances of 'str' and 'int'
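Note that the TypeError at the bottom is only a side effect: the extraction error message (a string) is passed where the progress callback expects a number. Below is a minimal sketch of the mismatch, with hypothetical simplified signatures rather than the actual RAGFlow code.

# Simplified reproduction of the secondary TypeError seen above.
# set_progress here is a stand-in for the callback in task_executor.py.
def set_progress(prog=None, msg="Processing..."):
    # the real code compares prog with an int (task_executor.py line 80)
    if prog is not None and prog < 0:
        msg = "[ERROR] " + msg
    print(prog, msg)

# graph_extractor passes the error text as the first positional argument,
# so prog becomes a str and 'str < int' raises the TypeError seen above,
# which masks the original "Request timed out" exception.
try:
    set_progress("Knowledge graph extraction error: Request timed out")
except TypeError as e:
    print(e)  # '<' not supported between instances of 'str' and 'int'

# Passing the message by keyword (and a numeric prog) avoids the crash:
set_progress(prog=-1, msg="Knowledge graph extraction error: Request timed out")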
I changed the "System Model Settings" and it works:
In the top right corner, click your icon, choose "Model Provider" on the left, open "System Model Settings", and change the "Embedding model" (this is the key step).
And this is not the latest version of the code.
Is there an existing issue for the same bug?
Branch name
knowledge base chunk method: knowledge graph
Commit ID
2024-08-22
Other environment information
Actual behavior
I got an error while building a knowledge base of the knowledge graph type.
The error message says there is no valid API key for aliyuncs.com.
I tried different embedding models and monitored the Docker logs, and I found that the error occurred after the embedding part was done. So the bug must be in the chunk method.
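For context, the knowledge graph chunk method does not stop at embedding: each chunk is also sent to the text-generation model for entity and relation extraction, which is where the 401 appears. A rough sketch of that flow follows (names are simplified placeholders, not the actual build_knowlege_graph_chunks code):

# Why the build can fail on the text-generation API even after embedding
# succeeds: graph extraction sends every chunk to the chat LLM.
# chat_llm and embed_model are placeholders, not RAGFlow's real interfaces.
def build_knowledge_graph(chunks, chat_llm, embed_model):
    graphs = []
    for text in chunks:
        # each chunk goes through an LLM prompt to extract entities and relations;
        # an invalid text-generation API key fails here with 401
        graphs.append(chat_llm("Extract entities and relations:\n" + text))
    # embeddings are a separate call and can succeed while the step above fails
    vectors = [embed_model(text) for text in chunks]
    return graphs, vectors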
Expected behavior
The knowledge graph is built, and we can see it in the dataset by clicking the file name.
Steps to reproduce
Additional information
[ERROR] [2024-08-22 07:16:53,692] [http_request._handle_response] [line:172]: Request: https://dashscope.aliyuncs.com/api/v1/services/aigc/text-generation/generation failed, status: 401, message: Invalid API-key provided.
[ERROR] [2024-08-22 07:16:54,744] [http_request._handle_response] [line:172]: Request: https://dashscope.aliyuncs.com/api/v1/services/aigc/text-generation/generation failed, status: 401, message: Invalid API-key provided.
[ERROR] [2024-08-22 07:16:55,454] [http_request._handle_response] [line:172]: Request: https://dashscope.aliyuncs.com/api/v1/services/aigc/text-generation/generation failed, status: 401, message: Invalid API-key provided.
[ERROR] [2024-08-22 07:16:56,271] [http_request._handle_response] [line:172]: Request: https://dashscope.aliyuncs.com/api/v1/services/embeddings/text-embedding/text-embedding failed, status: 401, message: Invalid API-key provided.
/ragflow/rag/utils/es_conn.py:131: ElasticsearchWarning: A bulk action object contained multiple keys. Additional keys are currently ignored but will be rejected in a future version.
  r = self.es.bulk(index=(self.idxnm if not idx_nm else
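The ElasticsearchWarning at the end is unrelated to the API key problem; it is about the shape of the bulk request body. With the Elasticsearch Python client (8.x assumed here), each action header should be a single-key dict followed by the document source as a separate item, roughly as in this sketch (index name and fields are made up for illustration):

from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

# Each action header is its own single-key dict; the document source follows
# as a separate element. Mixing extra keys into the header triggers the warning.
operations = [
    {"index": {"_index": "ragflow_demo", "_id": "1"}},
    {"title": "chunk one", "content": "..."},
    {"index": {"_index": "ragflow_demo", "_id": "2"}},
    {"title": "chunk two", "content": "..."},
]
r = es.bulk(operations=operations)
print(r["errors"])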