intel-analytics / ipex-llm

Accelerate local LLM inference and finetuning (LLaMA, Mistral, ChatGLM, Qwen, Mixtral, Gemma, Phi, MiniCPM, Qwen-VL, MiniCPM-V, etc.) on Intel XPU (e.g., local PC with iGPU and NPU, discrete GPU such as Arc, Flex and Max); seamlessly integrate with llama.cpp, Ollama, HuggingFace, LangChain, LlamaIndex, vLLM, GraphRAG, DeepSpeed, Axolotl, etc
Apache License 2.0
6.72k stars · 1.26k forks

Compatibility issues between transformers and accelerate versions. #11907

Closed · brownplayer closed this issue 2 months ago

brownplayer commented 2 months ago

A critical error occurs during the open-webui installation:

RuntimeError: Failed to import transformers.trainer because of the following error (look up to see its traceback): cannot import name 'is_mlu_available' from 'accelerate.utils' (D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\accelerate\utils\__init__.py)

What should I do if the accelerate version is incompatible?

The following is the version information for both modules:

(llm-cpp) D:\ipex-llm-demo\demo\ipex-llm\open-webui-main\backend>pip show transformers accelerate
Name: transformers
Version: 4.44.2
Summary: State-of-the-art Machine Learning for JAX, PyTorch and TensorFlow
Home-page: https://github.com/huggingface/transformers
Author: The Hugging Face team (past and future) with the help of all our contributors (https://github.com/huggingface/transformers/graphs/contributors)
Author-email: transformers@huggingface.co
License: Apache 2.0 License
Location: D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages
Requires: filelock, huggingface-hub, numpy, packaging, pyyaml, regex, requests, safetensors, tokenizers, tqdm
Required-by: bigdl-core-cpp, sentence-transformers

Name: accelerate
Version: 0.21.0
Summary: Accelerate
Home-page: https://github.com/huggingface/accelerate
Author: The HuggingFace team
Author-email: sylvain@huggingface.co
License: Apache
Location: D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages
Requires: numpy, packaging, psutil, pyyaml, torch
Required-by: bigdl-core-cpp
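For reference, the mismatch can be checked without triggering the failing import. This is a diagnostic sketch, not part of either library — the `diagnose` helper is made up, and it only reports what is installed and whether `accelerate.utils` exposes the symbol that `transformers` tries to import:

```python
# Diagnostic sketch (stdlib only): report installed transformers/accelerate
# versions and whether accelerate.utils exposes `is_mlu_available`.
from importlib import import_module, metadata

def diagnose() -> dict:
    report = {}
    for pkg in ("transformers", "accelerate"):
        try:
            report[pkg] = metadata.version(pkg)
        except metadata.PackageNotFoundError:
            report[pkg] = None  # package not installed in this environment
    try:
        utils = import_module("accelerate.utils")
        report["is_mlu_available"] = hasattr(utils, "is_mlu_available")
    except Exception:
        report["is_mlu_available"] = False  # accelerate missing or broken
    return report

print(diagnose())
```

If `is_mlu_available` comes back `False` while both packages are installed, the installed accelerate is too old for the installed transformers.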

JinheTang commented 2 months ago

Hi @brownplayer, we will take a look to see if we can reproduce the issue first. If there is any progress, we will update here to let you know.

brownplayer commented 2 months ago

OK, thank you.


JinheTang commented 2 months ago

Hi @brownplayer, it seems your accelerate version is incompatible with your transformers version. Running `pip install --pre --upgrade accelerate` should fix it.
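A quick way to confirm the upgrade worked is to retry the exact import that failed. This is a sketch (the function name is made up) that degrades gracefully when the packages aren't installed:

```python
import importlib.util

def trainer_import_ok() -> bool:
    """True if transformers.trainer imports cleanly, i.e. the installed
    accelerate is new enough for the installed transformers."""
    if importlib.util.find_spec("transformers") is None:
        return False  # transformers not installed at all
    try:
        import transformers.trainer  # noqa: F401  (the import that raised RuntimeError above)
        return True
    except Exception:  # transformers wraps the underlying ImportError in a RuntimeError
        return False

print(trainer_import_ok())
```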

brownplayer commented 2 months ago

> Hi @brownplayer, it seems your accelerate version is incompatible with your transformers version. Running `pip install --pre --upgrade accelerate` should fix it.

I have upgraded accelerate and transformers, but there are new errors (image). Entering:

set no_proxy=localhost,127.0.0.1
start_windows.bat

displays: (image)

JinheTang commented 2 months ago

> Hi @brownplayer, it seems your accelerate version is incompatible with your transformers version. Running `pip install --pre --upgrade accelerate` should fix it.
>
> I have upgraded accelerate and transformers, but there are new errors (image). Entering `set no_proxy=localhost,127.0.0.1` then `start_windows.bat` displays: (image)

The link in your OSError seems a bit odd. Could you provide your full terminal command input and the full error log?

brownplayer commented 2 months ago

> Hi @brownplayer, it seems your accelerate version is incompatible with your transformers version. Running `pip install --pre --upgrade accelerate` should fix it.
>
> I have upgraded accelerate and transformers, but there are new errors (image). Entering `set no_proxy=localhost,127.0.0.1` then `start_windows.bat` displays: (image)
>
> The link in your OSError seems a bit odd. Could you provide your full terminal command input and the full error log?

WARNING: CORS_ALLOW_ORIGIN IS SET TO '*' - NOT RECOMMENDED FOR PRODUCTION DEPLOYMENTS.

My input command is:

set no_proxy=localhost,127.0.0.1
start_windows.bat

My error log is:

D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\pydub\utils.py:170: RuntimeWarning: Couldn't find ffmpeg or avconv - defaulting to ffmpeg, but may not work
  warn("Couldn't find ffmpeg or avconv - defaulting to ffmpeg, but may not work", RuntimeWarning)
USER_AGENT environment variable not set, consider setting it to identify your requests.
Cannot determine model snapshot path: Cannot find an appropriate cached snapshot folder for the specified revision on the local disk and outgoing traffic has been disabled. To enable repo look-ups and downloads online, pass 'local_files_only=False' as input.
Traceback (most recent call last):
  File "D:\ipex-llm-demo\open-webui-main\open-webui-main\backend\apps\rag\utils.py", line 353, in get_model_path
    model_repo_path = snapshot_download(**snapshot_kwargs)
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\huggingface_hub\_snapshot_download.py", line 225, in snapshot_download
    raise LocalEntryNotFoundError(
huggingface_hub.utils._errors.LocalEntryNotFoundError: Cannot find an appropriate cached snapshot folder for the specified revision on the local disk and outgoing traffic has been disabled. To enable repo look-ups and downloads online, pass 'local_files_only=False' as input.
No sentence-transformers model found with name sentence-transformers/all-MiniLM-L6-v2. Creating a new one with mean pooling.

Traceback (most recent call last):
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\urllib3\connection.py", line 196, in _new_conn
    sock = connection.create_connection(
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\urllib3\util\connection.py", line 60, in create_connection
    for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\socket.py", line 962, in getaddrinfo
    for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
socket.gaierror: [Errno 11001] getaddrinfo failed

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\urllib3\connectionpool.py", line 789, in urlopen
    response = self._make_request(
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\urllib3\connectionpool.py", line 490, in _make_request
    raise new_e
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\urllib3\connectionpool.py", line 466, in _make_request
    self._validate_conn(conn)
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\urllib3\connectionpool.py", line 1095, in _validate_conn
    conn.connect()
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\urllib3\connection.py", line 615, in connect
    self.sock = sock = self._new_conn()
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\urllib3\connection.py", line 203, in _new_conn
    raise NameResolutionError(self.host, self, e) from e
urllib3.exceptions.NameResolutionError: <urllib3.connection.HTTPSConnection object at 0x00000223A4953BD0>: Failed to resolve 'hf-mirror.comstart_windows.bat' ([Errno 11001] getaddrinfo failed)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\requests\adapters.py", line 667, in send
    resp = conn.urlopen(
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\urllib3\connectionpool.py", line 843, in urlopen
    retries = retries.increment(
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\urllib3\util\retry.py", line 519, in increment
    raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='hf-mirror.comstart_windows.bat', port=443): Max retries exceeded with url: /sentence-transformers/all-MiniLM-L6-v2/resolve/main/config.json (Caused by NameResolutionError("<urllib3.connection.HTTPSConnection object at 0x00000223A4953BD0>: Failed to resolve 'hf-mirror.comstart_windows.bat' ([Errno 11001] getaddrinfo failed)"))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\huggingface_hub\file_download.py", line 1751, in _get_metadata_or_catch_error
    metadata = get_hf_file_metadata(
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\huggingface_hub\file_download.py", line 1673, in get_hf_file_metadata
    r = _request_wrapper(
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\huggingface_hub\file_download.py", line 376, in _request_wrapper
    response = _request_wrapper(
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\huggingface_hub\file_download.py", line 399, in _request_wrapper
    response = get_session().request(method=method, url=url, **params)
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\requests\sessions.py", line 589, in request
    resp = self.send(prep, **send_kwargs)
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\requests\sessions.py", line 703, in send
    r = adapter.send(request, **kwargs)
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\huggingface_hub\utils\_http.py", line 66, in send
    return super().send(request, *args, **kwargs)
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\requests\adapters.py", line 700, in send
    raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: (MaxRetryError('HTTPSConnectionPool(host=\'hf-mirror.comstart_windows.bat\', port=443): Max retries exceeded with url: /sentence-transformers/all-MiniLM-L6-v2/resolve/main/config.json (Caused by NameResolutionError("<urllib3.connection.HTTPSConnection object at 0x00000223A4953BD0>: Failed to resolve \'hf-mirror.comstart_windows.bat\' ([Errno 11001] getaddrinfo failed)"))'), '(Request ID: bce7d6cb-12bf-46e5-9048-715d0897852f)')

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\transformers\utils\hub.py", line 402, in cached_file
    resolved_file = hf_hub_download(
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\huggingface_hub\utils\_deprecation.py", line 101, in inner_f
    return f(*args, **kwargs)
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\huggingface_hub\file_download.py", line 1240, in hf_hub_download
    return _hf_hub_download_to_cache_dir(
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\huggingface_hub\file_download.py", line 1347, in _hf_hub_download_to_cache_dir
    _raise_on_head_call_error(head_call_error, force_download, local_files_only)
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\huggingface_hub\file_download.py", line 1857, in _raise_on_head_call_error
    raise LocalEntryNotFoundError(
huggingface_hub.utils._errors.LocalEntryNotFoundError: An error happened while trying to locate the file on the Hub and we cannot find the requested files in the local cache. Please check your connection and try again or make sure your Internet connection is on.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Scripts\uvicorn.exe\__main__.py", line 7, in <module>
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\click\core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\click\core.py", line 1078, in main
    rv = self.invoke(ctx)
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\click\core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\click\core.py", line 783, in invoke
    return __callback(*args, **kwargs)
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\uvicorn\main.py", line 410, in main
    run(
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\uvicorn\main.py", line 577, in run
    server.run()
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\uvicorn\server.py", line 65, in run
    return asyncio.run(self.serve(sockets=sockets))
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\asyncio\runners.py", line 190, in run
    return runner.run(main)
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\asyncio\runners.py", line 118, in run
    return self._loop.run_until_complete(task)
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\asyncio\base_events.py", line 654, in run_until_complete
    return future.result()
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\uvicorn\server.py", line 69, in serve
    await self._serve(sockets)
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\uvicorn\server.py", line 76, in _serve
    config.load()
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\uvicorn\config.py", line 434, in load
    self.loaded_app = import_from_string(self.app)
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\uvicorn\importer.py", line 19, in import_from_string
    module = importlib.import_module(module_str)
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\importlib\__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1204, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1176, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1147, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 690, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 940, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "D:\ipex-llm-demo\open-webui-main\open-webui-main\backend\main.py", line 45, in <module>
    from apps.rag.main import app as rag_app
  File "D:\ipex-llm-demo\open-webui-main\open-webui-main\backend\apps\rag\main.py", line 224, in <module>
    update_embedding_model(
  File "D:\ipex-llm-demo\open-webui-main\open-webui-main\backend\apps\rag\main.py", line 199, in update_embedding_model
    app.state.sentence_transformer_ef = sentence_transformers.SentenceTransformer(
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\sentence_transformers\SentenceTransformer.py", line 299, in __init__
    modules = self._load_auto_model(
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\sentence_transformers\SentenceTransformer.py", line 1324, in _load_auto_model
    transformer_model = Transformer(
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\sentence_transformers\models\Transformer.py", line 53, in __init__
    config = AutoConfig.from_pretrained(model_name_or_path, **config_args, cache_dir=cache_dir)
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\transformers\models\auto\configuration_auto.py", line 976, in from_pretrained
    config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\transformers\configuration_utils.py", line 632, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\transformers\configuration_utils.py", line 689, in _get_config_dict
    resolved_config_file = cached_file(
  File "D:\ipex-llm-demo\miniforge\envs\llm-cpp\Lib\site-packages\transformers\utils\hub.py", line 445, in cached_file
    raise EnvironmentError(
OSError: We couldn't connect to 'https://hf-mirror.comstart_windows.bat' to load this file, couldn't find it in the cached files and it looks like sentence-transformers/all-MiniLM-L6-v2 is not the path to a directory containing a file named config.json. Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.

brownplayer commented 2 months ago

(image)

JinheTang commented 2 months ago


We didn't manage to reproduce this issue. Maybe you can check your proxy and internet connection. Did you run `set HF_ENDPOINT=https://hf-mirror.com` before `start_windows.bat`? This command-line input works for me:

set HF_ENDPOINT=https://hf-mirror.com
set no_proxy=localhost,127.0.0.1
start_windows.bat
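For context, huggingface_hub builds its download URLs from the `HF_ENDPOINT` environment variable, falling back to the official hub when it is unset — the garbled host `hf-mirror.comstart_windows.bat` in the log above is what you would get if the endpoint value and the next command ran together on one line. A minimal sketch of the lookup (the `resolve_endpoint` helper is illustrative, not the library's actual code):

```python
import os

DEFAULT_ENDPOINT = "https://huggingface.co"  # used when HF_ENDPOINT is unset

def resolve_endpoint() -> str:
    """Mimic how huggingface_hub picks the Hub endpoint from the environment."""
    return os.environ.get("HF_ENDPOINT", DEFAULT_ENDPOINT).rstrip("/")

os.environ["HF_ENDPOINT"] = "https://hf-mirror.com"
print(resolve_endpoint())  # → https://hf-mirror.com
```

Note that the real library reads the variable at import time, so it must be set before the process starts — which is what running the `set` command before `start_windows.bat` accomplishes.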

After this, the terminal is expected to show: (image)

brownplayer commented 2 months ago

I succeeded after I ran `set HF_ENDPOINT=https://hf-mirror.com`.

brownplayer commented 2 months ago

Thank you for your contributions

brownplayer commented 2 months ago

I will close this issue