GaiZhenbiao / ChuanhuChatGPT

GUI for ChatGPT API and many LLMs. Supports agents, file-based QA, GPT finetuning and query with web search. All with a neat UI.
https://huggingface.co/spaces/JohnSmith9982/ChuanhuChatGPT
GNU General Public License v3.0
15.15k stars 2.28k forks

[Docker]: The Docker version cannot correctly answer questions about files #798

Open sebastian0619 opened 1 year ago

sebastian0619 commented 1 year ago

Have you checked for existing feedback and answers?

Is this a proxy-configuration question?

Error description

  1. After the initial Docker deployment, attempting to chat with a file raised an error. The error log said the module "sentence_transformers" had to be installed. My workaround was to open a bash shell inside the running container and run pip install sentence_transformers directly, without modifying the Dockerfile or requirements.txt. After that, files could be dragged and dropped normally and the index was built.
  2. Following the reproduction steps normally produces a log complaining about the choices key (error log 1). After I ran pip install --upgrade langchain (which seems to have fixed that?), error log 2 below appeared.

Steps to reproduce

  1. Complete the Docker deployment normally and set up the configuration file
  2. Drag and drop a PDF file (problem 1 occurs)
  3. Chat with the PDF file (problem 2 occurs)

Error logs

Error log 1:

Traceback (most recent call last):
  File "/root/.local/lib/python3.9/site-packages/gradio/routes.py", line 414, in run_predict
    output = await app.get_blocks().process_api(
  File "/root/.local/lib/python3.9/site-packages/gradio/blocks.py", line 1320, in process_api
    result = await self.call_function(
  File "/root/.local/lib/python3.9/site-packages/gradio/blocks.py", line 1048, in call_function
    prediction = await anyio.to_thread.run_sync(
  File "/root/.local/lib/python3.9/site-packages/anyio/to_thread.py", line 33, in run_sync
    return await get_asynclib().run_sync_in_worker_thread(
  File "/root/.local/lib/python3.9/site-packages/anyio/_backends/_asyncio.py", line 877, in run_sync_in_worker_thread
    return await future
  File "/root/.local/lib/python3.9/site-packages/anyio/_backends/_asyncio.py", line 807, in run
    result = context.run(func, *args)
  File "/app/modules/utils.py", line 120, in handle_summarize_index
    return current_model.summarize_index(*args)
  File "/app/modules/models/base_model.py", line 304, in summarize_index
    summary = chain({"input_documents": list(index.docstore.__dict__["_dict"].values())}, return_only_outputs=True)["output_text"]
  File "/root/.local/lib/python3.9/site-packages/langchain/chains/base.py", line 140, in __call__
    raise e
  File "/root/.local/lib/python3.9/site-packages/langchain/chains/base.py", line 134, in __call__
    self._call(inputs, run_manager=run_manager)
  File "/root/.local/lib/python3.9/site-packages/langchain/chains/combine_documents/base.py", line 84, in _call
    output, extra_return_dict = self.combine_docs(
  File "/root/.local/lib/python3.9/site-packages/langchain/chains/combine_documents/map_reduce.py", line 144, in combine_docs
    results = self.llm_chain.apply(
  File "/root/.local/lib/python3.9/site-packages/langchain/chains/llm.py", line 157, in apply
    raise e
  File "/root/.local/lib/python3.9/site-packages/langchain/chains/llm.py", line 154, in apply
    response = self.generate(input_list, run_manager=run_manager)
  File "/root/.local/lib/python3.9/site-packages/langchain/chains/llm.py", line 79, in generate
    return self.llm.generate_prompt(
  File "/root/.local/lib/python3.9/site-packages/langchain/chat_models/base.py", line 143, in generate_prompt
    return self.generate(prompt_messages, stop=stop, callbacks=callbacks)
  File "/root/.local/lib/python3.9/site-packages/langchain/chat_models/base.py", line 91, in generate
    raise e
  File "/root/.local/lib/python3.9/site-packages/langchain/chat_models/base.py", line 83, in generate
    results = [
  File "/root/.local/lib/python3.9/site-packages/langchain/chat_models/base.py", line 84, in <listcomp>
    self._generate(m, stop=stop, run_manager=run_manager)
  File "/root/.local/lib/python3.9/site-packages/langchain/chat_models/openai.py", line 297, in _generate
    return self._create_chat_result(response)
  File "/root/.local/lib/python3.9/site-packages/langchain/chat_models/openai.py", line 312, in _create_chat_result
    for res in response["choices"]:
KeyError: 'choices'
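The KeyError: 'choices' above usually means the OpenAI-compatible endpoint returned an error payload (invalid API key, proxy failure, rate limit) instead of a normal completion, and the langchain version in use indexed response["choices"] without checking for that case. A minimal sketch of the distinction, using a hypothetical extract_reply helper (not part of ChuanhuChatGPT or langchain) and an assumed error-payload shape of {"error": {"message": ...}}:

```python
# Hypothetical helper illustrating why KeyError: 'choices' surfaces.
# When an OpenAI-style chat endpoint fails, the JSON body typically
# carries an "error" object instead of "choices"; indexing
# response["choices"] directly then raises the bare KeyError seen above.

def extract_reply(response: dict) -> str:
    """Return the first choice's message content, or raise a readable error."""
    if "choices" not in response:
        # Assumed error shape: {"error": {"message": ..., "type": ...}}
        err = response.get("error", {})
        raise RuntimeError(
            f"API returned no choices: {err.get('message', response)}"
        )
    return response["choices"][0]["message"]["content"]
```

Upgrading langchain (as done above) helps because newer versions surface the API error message instead of the raw KeyError.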

Error log 2:
2023-06-08 23:20:59,467 [INFO] [base_model.py:632] filename: history/jingtian/2023-06-05_14-15-45.json
2023-06-08 23:21:03,749 [INFO] [index_func.py:123] Found a cached index file, loading...
2023-06-08 23:21:03,750 [INFO] [loader.py:64] Loading faiss.
2023-06-08 23:21:03,790 [INFO] [loader.py:66] Successfully loaded faiss.
2023-06-08 23:21:06,040 [INFO] [index_func.py:123] Found a cached index file, loading...
2023-06-08 23:21:06,042 [INFO] [base_model.py:294] Generating content summary...
Traceback (most recent call last):
  File "/root/.local/lib/python3.9/site-packages/gradio/routes.py", line 414, in run_predict
    output = await app.get_blocks().process_api(
  File "/root/.local/lib/python3.9/site-packages/gradio/blocks.py", line 1320, in process_api
    result = await self.call_function(
  File "/root/.local/lib/python3.9/site-packages/gradio/blocks.py", line 1048, in call_function
    prediction = await anyio.to_thread.run_sync(
  File "/root/.local/lib/python3.9/site-packages/anyio/to_thread.py", line 33, in run_sync
    return await get_asynclib().run_sync_in_worker_thread(
  File "/root/.local/lib/python3.9/site-packages/anyio/_backends/_asyncio.py", line 877, in run_sync_in_worker_thread
    return await future
  File "/root/.local/lib/python3.9/site-packages/anyio/_backends/_asyncio.py", line 807, in run
    result = context.run(func, *args)
  File "/app/modules/utils.py", line 120, in handle_summarize_index
    return current_model.summarize_index(*args)
  File "/app/modules/models/base_model.py", line 304, in summarize_index
    summary = chain({"input_documents": list(index.docstore.__dict__["_dict"].values())}, return_only_outputs=True)["output_text"]
  File "/usr/local/lib/python3.9/site-packages/langchain/chains/base.py", line 140, in __call__
    raise e
  File "/usr/local/lib/python3.9/site-packages/langchain/chains/base.py", line 134, in __call__
    self._call(inputs, run_manager=run_manager)
  File "/usr/local/lib/python3.9/site-packages/langchain/chains/combine_documents/base.py", line 84, in _call
    output, extra_return_dict = self.combine_docs(
  File "/usr/local/lib/python3.9/site-packages/langchain/chains/combine_documents/map_reduce.py", line 144, in combine_docs
    results = self.llm_chain.apply(
  File "/usr/local/lib/python3.9/site-packages/langchain/chains/llm.py", line 157, in apply
    raise e
  File "/usr/local/lib/python3.9/site-packages/langchain/chains/llm.py", line 154, in apply
    response = self.generate(input_list, run_manager=run_manager)
  File "/usr/local/lib/python3.9/site-packages/langchain/chains/llm.py", line 79, in generate
    return self.llm.generate_prompt(
  File "/usr/local/lib/python3.9/site-packages/langchain/chat_models/base.py", line 143, in generate_prompt
    return self.generate(prompt_messages, stop=stop, callbacks=callbacks)
  File "/usr/local/lib/python3.9/site-packages/langchain/chat_models/base.py", line 91, in generate
    raise e
  File "/usr/local/lib/python3.9/site-packages/langchain/chat_models/base.py", line 83, in generate
    results = [
  File "/usr/local/lib/python3.9/site-packages/langchain/chat_models/base.py", line 84, in <listcomp>
    self._generate(m, stop=stop, run_manager=run_manager)
  File "/usr/local/lib/python3.9/site-packages/langchain/chat_models/openai.py", line 297, in _generate

Runtime environment

Additional notes

No response

YANG301 commented 4 months ago

I ran into the same problem as you on Linux, and my fix was the same as yours. My Docker image was also pulled directly as the latest one. Here is my error log:

2024-05-03 11:29:15,628 [INFO] [index_func.py:29] loading file: 基于YOLOv5Gradio的在线多功能视频图像检测系统.docx
/root/.local/lib/python3.10/site-packages/langchain/document_loaders/__init__.py:36: LangChainDeprecationWarning: Importing document loaders from langchain is deprecated. Importing from langchain will no longer be supported as of langchain==0.2.0. Please import from langchain-community instead:

from langchain_community.document_loaders import UnstructuredWordDocumentLoader.

To install langchain-community run pip install -U langchain-community.
  warnings.warn(
2024-05-03 11:29:21,299 [ERROR] [index_func.py:81] Error loading file: 基于YOLOv5Gradio的在线多功能视频图像检测系统.docx
Traceback (most recent call last):
  File "/root/.local/lib/python3.10/site-packages/unstructured/nlp/tokenize.py", line 21, in _download_nltk_package_if_not_present
    nltk.find(f"{package_category}/{package_name}")
  File "/root/.local/lib/python3.10/site-packages/nltk/data.py", line 583, in find
    raise LookupError(resource_not_found)
LookupError:
  Resource punkt not found.
  Please use the NLTK Downloader to obtain the resource:

  >>> import nltk
  >>> nltk.download('punkt')

  For more information see: https://www.nltk.org/data.html

  Attempted to load tokenizers/punkt

  Searched in:

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/app/modules/index_func.py", line 53, in get_documents
    texts = loader.load()
  File "/root/.local/lib/python3.10/site-packages/langchain_core/document_loaders/base.py", line 29, in load
    return list(self.lazy_load())
  File "/root/.local/lib/python3.10/site-packages/langchain_community/document_loaders/unstructured.py", line 88, in lazy_load
    elements = self._get_elements()
  File "/root/.local/lib/python3.10/site-packages/langchain_community/document_loaders/word_document.py", line 125, in _get_elements
    return partition_docx(filename=self.file_path, **self.unstructured_kwargs)
  File "/root/.local/lib/python3.10/site-packages/unstructured/documents/elements.py", line 539, in wrapper
    elements = func(*args, **kwargs)
  File "/root/.local/lib/python3.10/site-packages/unstructured/file_utils/filetype.py", line 622, in wrapper
    elements = func(*args, **kwargs)
  File "/root/.local/lib/python3.10/site-packages/unstructured/file_utils/filetype.py", line 582, in wrapper
    elements = func(*args, **kwargs)
  File "/root/.local/lib/python3.10/site-packages/unstructured/chunking/dispatch.py", line 83, in wrapper
    elements = func(*args, **kwargs)
  File "/root/.local/lib/python3.10/site-packages/unstructured/partition/docx.py", line 233, in partition_docx
    return list(elements)
  File "/root/.local/lib/python3.10/site-packages/unstructured/partition/lang.py", line 397, in apply_lang_metadata
    elements = list(elements)
  File "/root/.local/lib/python3.10/site-packages/unstructured/partition/docx.py", line 314, in _iter_document_elements
    yield from self._iter_paragraph_elements(block_item)
  File "/root/.local/lib/python3.10/site-packages/unstructured/partition/docx.py", line 555, in _iter_paragraph_elements
    yield from self._classify_paragraph_to_element(item)
  File "/root/.local/lib/python3.10/site-packages/unstructured/partition/docx.py", line 375, in _classify_paragraph_to_element
    TextSubCls = self._parse_paragraph_text_for_element_type(paragraph)
  File "/root/.local/lib/python3.10/site-packages/unstructured/partition/docx.py", line 884, in _parse_paragraph_text_for_element_type
    if is_possible_narrative_text(text):
  File "/root/.local/lib/python3.10/site-packages/unstructured/partition/text_type.py", line 78, in is_possible_narrative_text
    if exceeds_cap_ratio(text, threshold=cap_threshold):
  File "/root/.local/lib/python3.10/site-packages/unstructured/partition/text_type.py", line 274, in exceeds_cap_ratio
    if sentence_count(text, 3) > 1:
  File "/root/.local/lib/python3.10/site-packages/unstructured/partition/text_type.py", line 223, in sentence_count
    sentences = sent_tokenize(text)
  File "/root/.local/lib/python3.10/site-packages/unstructured/nlp/tokenize.py", line 29, in sent_tokenize
    _download_nltk_package_if_not_present(package_category="tokenizers", package_name="punkt")
  File "/root/.local/lib/python3.10/site-packages/unstructured/nlp/tokenize.py", line 23, in _download_nltk_package_if_not_present
    nltk.download(package_name)
  File "/root/.local/lib/python3.10/site-packages/nltk/downloader.py", line 777, in download
    for msg in self.incr_download(info_or_id, download_dir, force):
  File "/root/.local/lib/python3.10/site-packages/nltk/downloader.py", line 629, in incr_download
    info = self._info_or_id(info_or_id)
  File "/root/.local/lib/python3.10/site-packages/nltk/downloader.py", line 603, in _info_or_id
    return self.info(info_or_id)
  File "/root/.local/lib/python3.10/site-packages/nltk/downloader.py", line 1009, in info
    self._update_index()
  File "/root/.local/lib/python3.10/site-packages/nltk/downloader.py", line 952, in _update_index
    ElementTree.parse(urlopen(self._url)).getroot()
  File "/usr/local/lib/python3.10/xml/etree/ElementTree.py", line 1222, in parse
    tree.parse(source, parser)
  File "/usr/local/lib/python3.10/xml/etree/ElementTree.py", line 580, in parse
    self._root = parser._parse_whole(source)
xml.etree.ElementTree.ParseError: unclosed token: line 39, column 4
Traceback (most recent call last):
  File "/app/modules/models/base_model.py", line 411, in handle_file_upload
    construct_index(self.api_key, file_src=files)
  File "/app/modules/index_func.py", line 134, in construct_index
    raise Exception(i18n("没有找到任何支持的文档。"))
Exception: No supported documents found.

Date: 2024/5/3. I'm curious what is causing this?
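The punkt failure above happens because unstructured tries to download NLTK's tokenizer data lazily at parse time, and the download of NLTK's package index was truncated inside the container (the ParseError on the XML index suggests a blocked or proxied connection). A minimal sketch of pre-fetching the data once at container startup, assuming the container has network access; ensure_punkt is a hypothetical helper, not part of ChuanhuChatGPT:

```python
# Pre-fetch NLTK's punkt tokenizer at startup so unstructured does not
# attempt the download mid-parse. Guards are deliberately defensive:
# missing nltk and failed downloads are reported, not fatal.
import importlib.util

def ensure_punkt() -> bool:
    """Return True if nltk is installed and punkt is (now) available."""
    if importlib.util.find_spec("nltk") is None:
        return False  # nltk not installed; nothing to pre-fetch
    import nltk
    try:
        nltk.data.find("tokenizers/punkt")
        return True  # already present, no network needed
    except LookupError:
        pass
    try:
        # Equivalent CLI: python -m nltk.downloader punkt
        return bool(nltk.download("punkt"))
    except Exception as exc:  # e.g. the XML ParseError seen above
        print(f"punkt download failed: {exc}")
        return False

ensure_punkt()
```

If the container has no direct internet access, bake the data into the image instead: run the download at build time, or copy a pre-populated nltk_data directory into the image and point the NLTK_DATA environment variable at it.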