chatchat-space / Langchain-Chatchat

Langchain-Chatchat (formerly Langchain-ChatGLM): a local-knowledge-based RAG and Agent application built with Langchain and LLMs such as ChatGLM, Qwen, and Llama
Apache License 2.0

How to make loading a rerank model default to trust_remote_code=True #4080

Closed WSC741606 closed 3 months ago

WSC741606 commented 3 months ago

@dosu-bot I want to use a new rerank model, but loading it fails with the following error:

/data/home/user/Test/Langchain-Chatchat-Env/lib/python3.9/site-packages/huggingface_hub/file_download.py:1132: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`.
  warnings.warn(
ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "/data/home/user/Test/Langchain-Chatchat-Env/lib/python3.9/site-packages/sse_starlette/sse.py", line 269, in __call__
    await wrap(partial(self.listen_for_disconnect, receive))
  File "/data/home/user/Test/Langchain-Chatchat-Env/lib/python3.9/site-packages/sse_starlette/sse.py", line 258, in wrap
    await func()
  File "/data/home/user/Test/Langchain-Chatchat-Env/lib/python3.9/site-packages/sse_starlette/sse.py", line 215, in listen_for_disconnect
    message = await receive()
  File "/data/home/user/Test/Langchain-Chatchat-Env/lib/python3.9/site-packages/uvicorn/protocols/http/httptools_impl.py", line 568, in receive
    await self.message_event.wait()
  File "/usr/local/lib/python3.9/asyncio/locks.py", line 226, in wait
    await fut
asyncio.exceptions.CancelledError: Cancelled by cancel scope 2ada8143f970

During handling of the above exception, another exception occurred:

  + Exception Group Traceback (most recent call last):
  |   File "/data/home/user/Test/Langchain-Chatchat-Env/lib/python3.9/site-packages/uvicorn/protocols/http/httptools_impl.py", line 411, in run_asgi
  |     result = await app(  # type: ignore[func-returns-value]
  |   File "/data/home/user/Test/Langchain-Chatchat-Env/lib/python3.9/site-packages/uvicorn/middleware/proxy_headers.py", line 69, in __call__
  |     return await self.app(scope, receive, send)
  |   File "/data/home/user/Test/Langchain-Chatchat-Env/lib/python3.9/site-packages/fastapi/applications.py", line 1054, in __call__
  |     await super().__call__(scope, receive, send)
  |   File "/data/home/user/Test/Langchain-Chatchat-Env/lib/python3.9/site-packages/starlette/applications.py", line 119, in __call__
  |     await self.middleware_stack(scope, receive, send)
  |   File "/data/home/user/Test/Langchain-Chatchat-Env/lib/python3.9/site-packages/starlette/middleware/errors.py", line 186, in __call__
  |     raise exc
  |   File "/data/home/user/Test/Langchain-Chatchat-Env/lib/python3.9/site-packages/starlette/middleware/errors.py", line 164, in __call__
  |     await self.app(scope, receive, _send)
  |   File "/data/home/user/Test/Langchain-Chatchat-Env/lib/python3.9/site-packages/starlette/middleware/exceptions.py", line 62, in __call__
  |     await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  |   File "/data/home/user/Test/Langchain-Chatchat-Env/lib/python3.9/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
  |     raise exc
  |   File "/data/home/user/Test/Langchain-Chatchat-Env/lib/python3.9/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
  |     await app(scope, receive, sender)
  |   File "/data/home/user/Test/Langchain-Chatchat-Env/lib/python3.9/site-packages/starlette/routing.py", line 762, in __call__
  |     await self.middleware_stack(scope, receive, send)
  |   File "/data/home/user/Test/Langchain-Chatchat-Env/lib/python3.9/site-packages/starlette/routing.py", line 782, in app
  |     await route.handle(scope, receive, send)
  |   File "/data/home/user/Test/Langchain-Chatchat-Env/lib/python3.9/site-packages/starlette/routing.py", line 297, in handle
  |     await self.app(scope, receive, send)
  |   File "/data/home/user/Test/Langchain-Chatchat-Env/lib/python3.9/site-packages/starlette/routing.py", line 77, in app
  |     await wrap_app_handling_exceptions(app, request)(scope, receive, send)
  |   File "/data/home/user/Test/Langchain-Chatchat-Env/lib/python3.9/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
  |     raise exc
  |   File "/data/home/user/Test/Langchain-Chatchat-Env/lib/python3.9/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
  |     await app(scope, receive, sender)
  |   File "/data/home/user/Test/Langchain-Chatchat-Env/lib/python3.9/site-packages/starlette/routing.py", line 75, in app
  |     await response(scope, receive, send)
  |   File "/data/home/user/Test/Langchain-Chatchat-Env/lib/python3.9/site-packages/sse_starlette/sse.py", line 269, in __call__
  |     await wrap(partial(self.listen_for_disconnect, receive))
  |   File "/data/home/user/Test/Langchain-Chatchat-Env/lib/python3.9/site-packages/anyio/_backends/_asyncio.py", line 678, in __aexit__
  |     raise BaseExceptionGroup(
  | exceptiongroup.ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
  +-+---------------- 1 ----------------
    | Traceback (most recent call last):
    |   File "/data/home/user/Test/Langchain-Chatchat-Env/lib/python3.9/site-packages/sse_starlette/sse.py", line 258, in wrap
    |     await func()
    |   File "/data/home/user/Test/Langchain-Chatchat-Env/lib/python3.9/site-packages/sse_starlette/sse.py", line 245, in stream_response
    |     async for data in self.body_iterator:
    |   File "/data/home/user/Test/GitLibrary/Langchain-Chatchat/server/chat/knowledge_base_chat.py", line 88, in knowledge_base_chat_iterator
    |     reranker_model = LangchainReranker(top_n=top_k,
    |   File "/data/home/user/Test/GitLibrary/Langchain-Chatchat/server/reranker/reranker.py", line 49, in __init__
    |     self._model = CrossEncoder(model_name=model_name_or_path, max_length=1024, device=device)
    |   File "/data/home/user/Test/Langchain-Chatchat-Env/lib/python3.9/site-packages/sentence_transformers/cross_encoder/CrossEncoder.py", line 66, in __init__
    |     self.config = AutoConfig.from_pretrained(model_name, trust_remote_code=trust_remote_code, revision=revision)
    |   File "/data/home/user/Test/Langchain-Chatchat-Env/lib/python3.9/site-packages/transformers/models/auto/configuration_auto.py", line 937, in from_pretrained
    |     trust_remote_code = resolve_trust_remote_code(
    |   File "/data/home/user/Test/Langchain-Chatchat-Env/lib/python3.9/site-packages/transformers/dynamic_module_utils.py", line 636, in resolve_trust_remote_code
    |     raise ValueError(
    | ValueError: Loading BAAI/bge-reranker-v2-minicpm-layerwise requires you to execute the configuration file in that repo on your local machine. Make sure you have read the code there to avoid malicious use, then set the option `trust_remote_code=True` to remove this error.

Is there a global variable I can configure so that trust_remote_code=True is enabled by default?
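As far as I know, transformers does not expose a global setting or environment variable for trust_remote_code; the flag is resolved per call. A minimal sketch of a workaround, assuming the installed sentence-transformers CrossEncoder accepts a trust_remote_code keyword (the traceback shows it forwarding one to AutoConfig.from_pretrained), is to pass the flag where server/reranker/reranker.py constructs the model:

```python
# Hedged sketch only: the real LangchainReranker class in
# server/reranker/reranker.py has more parameters; the relevant change is
# adding trust_remote_code=True to the CrossEncoder call.
from sentence_transformers import CrossEncoder


def load_reranker(model_name_or_path: str, device: str = "cuda") -> CrossEncoder:
    # trust_remote_code=True lets transformers execute the custom configuration
    # and modeling code shipped with BAAI/bge-reranker-v2-minicpm-layerwise.
    # Review that repository's code before enabling this.
    return CrossEncoder(
        model_name=model_name_or_path,
        max_length=1024,
        device=device,
        trust_remote_code=True,
    )
```

If the goal is to enable this for every model the project loads, the same keyword would have to be added at each place an AutoConfig/AutoModel/CrossEncoder is constructed, since each call resolves trust_remote_code independently.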

WSC741606 commented 3 months ago

Uh... looks like the dosu bot is down.