GaiZhenbiao / ChuanhuChatGPT

GUI for ChatGPT API and many LLMs. Supports agents, file-based QA, GPT finetuning and query with web search. All with a neat UI.
https://huggingface.co/spaces/JohnSmith9982/ChuanhuChatGPT
GNU General Public License v3.0

[Other]: Error when duplicating the Space on Hugging Face #746

Closed mctds closed 1 year ago

mctds commented 1 year ago

Is there an existing issue or answer for this?

Error description

Duplicating the Space on Hugging Face fails with an error.

Steps to reproduce

After clicking "Duplicate this Space", the duplicated Space fails to start with the error below.

Error log

===== Build Queued at 2023-05-12 04:41:02 / Commit SHA: 7616055 =====

--> FROM docker.io/library/python:3.8.9@sha256:49d05fff9cb3b185b15ffd92d8e6bd61c20aa916133dca2e3dbe0215270faf53
DONE 0.0s

--> RUN pip install --no-cache-dir pip==22.3.1 &&     pip install --no-cache-dir         datasets         "huggingface-hub>=0.12.1" "protobuf<4" "click<8.1"
CACHED

--> RUN sed -i 's http://deb.debian.org http://cdn-aws.deb.debian.org g' /etc/apt/sources.list && sed -i 's http://archive.ubuntu.com http://us-east-1.ec2.archive.ubuntu.com g' /etc/apt/sources.list && sed -i '/security/d' /etc/apt/sources.list && apt-get update && apt-get install -y    git     git-lfs     ffmpeg  libsm6  libxext6    cmake   libgl1-mesa-glx     && rm -rf /var/lib/apt/lists/*  && git lfs install
CACHED

--> RUN pip install --no-cache-dir         gradio==3.25.0
CACHED

--> RUN --mount=target=pre-requirements.txt,source=pre-requirements.txt     pip install --no-cache-dir -r pre-requirements.txt
CACHED

--> COPY --link --chown=1000 ./ /home/user/app
CACHED

--> COPY --link --chown=1000 --from=lfs /app /home/user/app
CACHED

--> RUN --mount=target=requirements.txt,source=requirements.txt     pip install --no-cache-dir -r requirements.txt
CACHED

--> WORKDIR /home/user/app
CACHED

--> RUN useradd -m -u 1000 user
CACHED

--> RUN --mount=target=/root/packages.txt,source=packages.txt   sed -i 's http://deb.debian.org http://cdn-aws.deb.debian.org g' /etc/apt/sources.list && sed -i 's http://archive.ubuntu.com http://us-east-1.ec2.archive.ubuntu.com g' /etc/apt/sources.list && sed -i '/security/d' /etc/apt/sources.list && apt-get update &&     xargs -r -a /root/packages.txt apt-get install -y     && rm -rf /var/lib/apt/lists/*
CACHED

--> Pushing image
DONE 1.0s

--> Exporting cache
DONE 1.1s

===== Application Startup at 2023-05-12 04:41:12 =====

Traceback (most recent call last):
  File "ChuanhuChatbot.py", line 12, in <module>
    from modules.overwrites import *
  File "/home/user/app/modules/overwrites.py", line 4, in <module>
    from llama_index import Prompt
  File "/home/user/.local/lib/python3.8/site-packages/llama_index/__init__.py", line 18, in <module>
    from llama_index.indices.common.struct_store.base import SQLDocumentContextBuilder
  File "/home/user/.local/lib/python3.8/site-packages/llama_index/indices/__init__.py", line 4, in <module>
    from llama_index.indices.keyword_table.base import GPTKeywordTableIndex
  File "/home/user/.local/lib/python3.8/site-packages/llama_index/indices/keyword_table/__init__.py", line 4, in <module>
    from llama_index.indices.keyword_table.base import GPTKeywordTableIndex
  File "/home/user/.local/lib/python3.8/site-packages/llama_index/indices/keyword_table/base.py", line 17, in <module>
    from llama_index.indices.base import BaseGPTIndex, QueryMap
  File "/home/user/.local/lib/python3.8/site-packages/llama_index/indices/base.py", line 11, in <module>
    from llama_index.indices.query.base import BaseGPTIndexQuery
  File "/home/user/.local/lib/python3.8/site-packages/llama_index/indices/query/base.py", line 21, in <module>
    from llama_index.indices.postprocessor.node import (
  File "/home/user/.local/lib/python3.8/site-packages/llama_index/indices/postprocessor/__init__.py", line 5, in <module>
    from llama_index.indices.postprocessor.node import (
  File "/home/user/.local/lib/python3.8/site-packages/llama_index/indices/postprocessor/node.py", line 11, in <module>
    from llama_index.indices.service_context import ServiceContext
  File "/home/user/.local/lib/python3.8/site-packages/llama_index/indices/service_context.py", line 6, in <module>
    from llama_index.indices.prompt_helper import PromptHelper
  File "/home/user/.local/lib/python3.8/site-packages/llama_index/indices/prompt_helper.py", line 12, in <module>
    from llama_index.langchain_helpers.chain_wrapper import LLMPredictor
  File "/home/user/.local/lib/python3.8/site-packages/llama_index/langchain_helpers/chain_wrapper.py", line 6, in <module>
    from llama_index.llm_predictor.base import (  # noqa: F401
  File "/home/user/.local/lib/python3.8/site-packages/llama_index/llm_predictor/__init__.py", line 4, in <module>
    from llama_index.llm_predictor.base import LLMPredictor
  File "/home/user/.local/lib/python3.8/site-packages/llama_index/llm_predictor/base.py", line 12, in <module>
    from langchain.schema import BaseLanguageModel
ImportError: cannot import name 'BaseLanguageModel' from 'langchain.schema' (/home/user/.local/lib/python3.8/site-packages/langchain/schema.py)

Runtime environment

Additional context

No response
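
The traceback above ends in the ImportError because the installed llama_index still imports BaseLanguageModel from langchain.schema, while the langchain release pulled in during the build no longer exposes it there (it appears to have moved to langchain.base_language). A minimal compatibility shim, offered only as a sketch and assuming the class was merely relocated between langchain releases:

# Sketch of a compatibility import: take BaseLanguageModel from whichever
# module the installed langchain release actually provides it in.
try:
    from langchain.schema import BaseLanguageModel          # older langchain releases
except ImportError:
    from langchain.base_language import BaseLanguageModel   # newer releases (assumed location)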

GaiZhenbiao commented 1 year ago

The issue has been resolved. The cause was a breaking change introduced in a newer version of langchain.
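
Independently of whatever fix was shipped in the repository, a common stopgap for this kind of breaking change is to pin langchain in requirements.txt to a release from before the import moved, or to upgrade llama_index to a release that already uses the new import path. A sketch of such a pin; the exact version numbers are assumptions for illustration and are not taken from this thread:

# requirements.txt (sketch; versions are illustrative assumptions)
langchain==0.0.162   # assumed: a release that still exports BaseLanguageModel from langchain.schema
# or, alternatively, upgrade llama_index to a release that imports from langchain.base_language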