chatchat-space / Langchain-Chatchat

Langchain-Chatchat (formerly Langchain-ChatGLM): a RAG and Agent application built with Langchain on top of local LLMs such as ChatGLM, Qwen, and Llama.
Apache License 2.0

Initializing the knowledge base with chatchat-kb -r fails on Windows 11 with Langchain-Chatchat 0.3 #4382

Closed. Qi0716 closed this issue 2 months ago.

Qi0716 commented 2 months ago

Problem Description
Running chatchat-kb -r to initialize the knowledge base on Windows 11 with Langchain-Chatchat 0.3 raises an error.

Steps to Reproduce
Run chatchat-kb -r; the problem occurs.

Expected Result
The knowledge base should initialize successfully, for example:


Knowledge base name: samples
Knowledge base type: faiss
Embedding model: bge-large-zh-v1.5
Knowledge base path: /root/anaconda3/envs/chatchat/lib/python3.11/site-packages/chatchat/data/knowledge_base/samples
Total files: 47
Files ingested: 42
Knowledge entries: 740
Time taken: 0:02:29.701002

Total time: 0:02:33.414425

Actual Result

(langchain) D:\other\Langchain-Chatchat-master>chatchat-kb -r
recreating all vector stores
C:\Users\30759\.conda\envs\langchain\lib\site-packages\langchain\_api\module_import.py:87: LangChainDeprecationWarning: Importing GuardrailsOutputParser from langchain.output_parsers is deprecated. Please replace the import with the following:
from langchain_community.output_parsers.rail_parser import GuardrailsOutputParser
  warnings.warn(
2024-07-03 09:28:17,524 - utils.py[line:260] - ERROR: failed to create Embeddings for model: bge-large-zh-v1.5.
Traceback (most recent call last):
  File "C:\Users\30759\.conda\envs\langchain\lib\site-packages\chatchat\server\utils.py", line 258, in get_Embeddings
    return LocalAIEmbeddings(**params)
  File "C:\Users\30759\.conda\envs\langchain\lib\site-packages\pydantic\v1\main.py", line 341, in __init__
    raise validation_error
pydantic.v1.error_wrappers.ValidationError: 1 validation error for LocalAIEmbeddings
__root__
  Did not find openai_api_key, please add an environment variable `OPENAI_API_KEY` which contains it, or pass `openai_api_key` as a named parameter. (type=value_error)
2024-07-03 09:28:17,537 - faiss_cache.py[line:140] - ERROR: 'NoneType' object has no attribute 'embed_documents'
Traceback (most recent call last):
  File "C:\Users\30759\.conda\envs\langchain\lib\site-packages\chatchat\server\knowledge_base\kb_cache\faiss_cache.py", line 126, in load_vector_store
    vector_store = self.new_vector_store(
  File "C:\Users\30759\.conda\envs\langchain\lib\site-packages\chatchat\server\knowledge_base\kb_cache\faiss_cache.py", line 63, in new_vector_store
    vector_store = FAISS.from_documents([doc], embeddings, normalize_L2=True)
  File "C:\Users\30759\.conda\envs\langchain\lib\site-packages\langchain_core\vectorstores.py", line 550, in from_documents
    return cls.from_texts(texts, embedding, metadatas=metadatas, **kwargs)
  File "C:\Users\30759\.conda\envs\langchain\lib\site-packages\langchain_community\vectorstores\faiss.py", line 930, in from_texts
    embeddings = embedding.embed_documents(texts)
AttributeError: 'NoneType' object has no attribute 'embed_documents'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\30759\.conda\envs\langchain\lib\site-packages\chatchat\init_database.py", line 129, in main
    folder2db(
  File "C:\Users\30759\.conda\envs\langchain\lib\site-packages\chatchat\server\knowledge_base\migrate.py", line 157, in folder2db
    kb.create_kb()
  File "C:\Users\30759\.conda\envs\langchain\lib\site-packages\chatchat\server\knowledge_base\kb_service\base.py", line 102, in create_kb
    self.do_create_kb()
  File "C:\Users\30759\.conda\envs\langchain\lib\site-packages\chatchat\server\knowledge_base\kb_service\faiss_kb_service.py", line 57, in do_create_kb
    self.load_vector_store()
  File "C:\Users\30759\.conda\envs\langchain\lib\site-packages\chatchat\server\knowledge_base\kb_service\faiss_kb_service.py", line 32, in load_vector_store
    return kb_faiss_pool.load_vector_store(
  File "C:\Users\30759\.conda\envs\langchain\lib\site-packages\chatchat\server\knowledge_base\kb_cache\faiss_cache.py", line 141, in load_vector_store
    raise RuntimeError(f"向量库 {kb_name} 加载失败。")
RuntimeError: 向量库 111 加载失败。
2024-07-03 09:28:17,580 - init_database.py[line:151] - WARNING: Caught KeyboardInterrupt! Setting stop event...

Environment Information

Additional Information
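For context on the traceback above: get_Embeddings() in chatchat/server/utils.py builds a LocalAIEmbeddings object, the Pydantic validation fails because no openai_api_key reaches it, the function returns None, and the later embed_documents call then fails on None. A minimal sketch (not the project's official fix) of constructing that same object with the credentials passed explicitly is below; the base URL and key are placeholder assumptions for whatever OpenAI-compatible service (e.g. Xinference) is serving bge-large-zh-v1.5.

```python
# Minimal sketch: build the embeddings object that chatchat's get_Embeddings()
# builds, but with credentials given explicitly. The base URL and API key are
# placeholders -- substitute the values of your own OpenAI-compatible endpoint.
from langchain_community.embeddings import LocalAIEmbeddings

embeddings = LocalAIEmbeddings(
    model="bge-large-zh-v1.5",
    openai_api_base="http://127.0.0.1:9997/v1",  # assumed local embedding endpoint
    openai_api_key="EMPTY",                      # any non-empty string if the server ignores auth
)

# If this succeeds, the embedding service itself is fine and the problem is that
# chatchat's platform configuration never supplies these values to get_Embeddings().
print(len(embeddings.embed_documents(["测试"])[0]))
```

Setting the OPENAI_API_KEY environment variable before running chatchat-kb -r is the other route the error message itself suggests.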

zhaozhizhuo commented 2 months ago

Has this been solved? I'm running into the same problem.

Qi0716 commented 2 months ago

Resolved.

zhaozhizhuo commented 2 months ago

How did you solve it?

zhaozhizhuo commented 2 months ago

My error is:

recreating all vector stores
/home/zhaozhizhuo22/.conda/envs/llm_tl/lib/python3.10/site-packages/langchain/_api/module_import.py:87: LangChainDeprecationWarning: Importing GuardrailsOutputParser from langchain.output_parsers is deprecated. Please replace the import with the following:
from langchain_community.output_parsers.rail_parser import GuardrailsOutputParser
  warnings.warn(
2024-07-05 17:03:45,456 - utils.py[line:260] - ERROR: failed to create Embeddings for model: custom.
Traceback (most recent call last):
  File "/home/zhaozhizhuo22/.conda/envs/llm_tl/lib/python3.10/site-packages/chatchat/server/utils.py", line 258, in get_Embeddings
    return LocalAIEmbeddings(**params)
  File "/home/zhaozhizhuo22/.conda/envs/llm_tl/lib/python3.10/site-packages/pydantic/v1/main.py", line 341, in __init__
    raise validation_error
pydantic.v1.error_wrappers.ValidationError: 1 validation error for LocalAIEmbeddings
__root__
  Did not find openai_api_key, please add an environment variable `OPENAI_API_KEY` which contains it, or pass `openai_api_key` as a named parameter. (type=value_error)
2024-07-05 17:03:45,457 - faiss_cache.py[line:140] - ERROR: 'NoneType' object has no attribute 'embed_documents'
Traceback (most recent call last):
  File "/home/zhaozhizhuo22/.conda/envs/llm_tl/lib/python3.10/site-packages/chatchat/server/knowledge_base/kb_cache/faiss_cache.py", line 126, in load_vector_store
    vector_store = self.new_vector_store(
  File "/home/zhaozhizhuo22/.conda/envs/llm_tl/lib/python3.10/site-packages/chatchat/server/knowledge_base/kb_cache/faiss_cache.py", line 63, in new_vector_store
    vector_store = FAISS.from_documents([doc], embeddings, normalize_L2=True)
  File "/home/zhaozhizhuo22/.conda/envs/llm_tl/lib/python3.10/site-packages/langchain_core/vectorstores.py", line 550, in from_documents
    return cls.from_texts(texts, embedding, metadatas=metadatas, **kwargs)
  File "/home/zhaozhizhuo22/.conda/envs/llm_tl/lib/python3.10/site-packages/langchain_community/vectorstores/faiss.py", line 930, in from_texts
    embeddings = embedding.embed_documents(texts)
AttributeError: 'NoneType' object has no attribute 'embed_documents'
2024-07-05 17:03:45,458 - init_database.py[line:150] - ERROR: 向量库 samples 加载失败。
Traceback (most recent call last):
  File "/home/zhaozhizhuo22/.conda/envs/llm_tl/lib/python3.10/site-packages/chatchat/server/knowledge_base/kb_cache/faiss_cache.py", line 126, in load_vector_store
    vector_store = self.new_vector_store(
  File "/home/zhaozhizhuo22/.conda/envs/llm_tl/lib/python3.10/site-packages/chatchat/server/knowledge_base/kb_cache/faiss_cache.py", line 63, in new_vector_store
    vector_store = FAISS.from_documents([doc], embeddings, normalize_L2=True)
  File "/home/zhaozhizhuo22/.conda/envs/llm_tl/lib/python3.10/site-packages/langchain_core/vectorstores.py", line 550, in from_documents
    return cls.from_texts(texts, embedding, metadatas=metadatas, **kwargs)
  File "/home/zhaozhizhuo22/.conda/envs/llm_tl/lib/python3.10/site-packages/langchain_community/vectorstores/faiss.py", line 930, in from_texts
    embeddings = embedding.embed_documents(texts)
AttributeError: 'NoneType' object has no attribute 'embed_documents'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/zhaozhizhuo22/.conda/envs/llm_tl/lib/python3.10/site-packages/chatchat/init_database.py", line 129, in main
    folder2db(
  File "/home/zhaozhizhuo22/.conda/envs/llm_tl/lib/python3.10/site-packages/chatchat/server/knowledge_base/migrate.py", line 157, in folder2db
    kb.create_kb()
  File "/home/zhaozhizhuo22/.conda/envs/llm_tl/lib/python3.10/site-packages/chatchat/server/knowledge_base/kb_service/base.py", line 102, in create_kb
    self.do_create_kb()
  File "/home/zhaozhizhuo22/.conda/envs/llm_tl/lib/python3.10/site-packages/chatchat/server/knowledge_base/kb_service/faiss_kb_service.py", line 57, in do_create_kb
    self.load_vector_store()
  File "/home/zhaozhizhuo22/.conda/envs/llm_tl/lib/python3.10/site-packages/chatchat/server/knowledge_base/kb_service/faiss_kb_service.py", line 32, in load_vector_store
    return kb_faiss_pool.load_vector_store(
  File "/home/zhaozhizhuo22/.conda/envs/llm_tl/lib/python3.10/site-packages/chatchat/server/knowledge_base/kb_cache/faiss_cache.py", line 141, in load_vector_store
    raise RuntimeError(f"向量库 {kb_name} 加载失败。")
RuntimeError: 向量库 samples 加载失败。
2024-07-05 17:03:45,459 - init_database.py[line:151] - WARNING: Caught KeyboardInterrupt! Setting stop event...
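The two stack traces in both reports are really one problem: the ValidationError makes get_Embeddings() return None, and FAISS.from_documents() then fails with the unrelated-looking AttributeError. A tiny illustration of that second step (plain langchain code, not chatchat's own) is sketched below.

```python
# Illustration only: when the embeddings object is None (what get_Embeddings()
# effectively returns after the ValidationError), FAISS fails exactly as in the
# logs above, before any FAISS index is ever built.
from langchain_community.vectorstores import FAISS
from langchain_core.documents import Document

embeddings = None
try:
    FAISS.from_documents([Document(page_content="hello")], embeddings)
except AttributeError as exc:
    print(exc)  # 'NoneType' object has no attribute 'embed_documents'
```

So fixing the missing key / embedding-platform configuration removes both errors at once.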

annsshadow commented 2 months ago

> Resolved.

Same here, also resolved.

annsshadow commented 2 months ago

> How did you solve it?

Same here, also resolved.

liunux4odoo commented 2 months ago

Version 0.3.1 has been released. It improves the configuration mechanism so that configuration changes no longer require restarting the server; please upgrade and try again.
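After upgrading, it may help to confirm which version is actually installed before re-running chatchat-kb -r; the distribution name langchain-chatchat below is an assumption about the 0.3.x PyPI package name.

```python
# Quick sanity check after upgrading; "langchain-chatchat" is assumed to be the
# PyPI distribution name of the 0.3.x package.
from importlib.metadata import version

print(version("langchain-chatchat"))  # expect 0.3.1 or later
```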

Staudinger0325 commented 1 month ago

Hi, how did you solve this?