Stability-AI / generative-models

Generative Models by Stability AI
MIT License

huggingface_hub.utils._errors.LocalEntryNotFoundError during `state = init_st(version_dict, load_filter=True)` (line:120 at scripts/demo/video_sampling.py) #152

Closed: haodong2000 closed this issue 7 months ago

haodong2000 commented 7 months ago

I've manually downloaded `svd.safetensors` from Hugging Face, but while loading the SVD model an error is raised at `state = init_st(version_dict, load_filter=True)` (line 120 of `scripts/demo/video_sampling.py`).

Full log:

```
Collecting usage statistics. To deactivate, set browser.gatherUsageStats to False.

You can now view your Streamlit app in your browser.

Network URL: http://10.120.16.9:8005
External URL: http://103.189.154.10:8005

VideoTransformerBlock is using checkpointing
(the line above is repeated 16 times)
2023-11-22 09:21:27.165 Uncaught app exception
Traceback (most recent call last):
  File "/root/miniconda3/envs/generative_models/lib/python3.10/site-packages/streamlit/runtime/caching/cache_utils.py", line 264, in _get_or_create_cached_value
    cached_result = cache.read_result(value_key)
  File "/root/miniconda3/envs/generative_models/lib/python3.10/site-packages/streamlit/runtime/caching/cache_resource_api.py", line 500, in read_result
    raise CacheKeyNotFoundError()
streamlit.runtime.caching.cache_errors.CacheKeyNotFoundError

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/root/miniconda3/envs/generative_models/lib/python3.10/site-packages/streamlit/runtime/caching/cache_utils.py", line 312, in _handle_cache_miss
    cached_result = cache.read_result(value_key)
  File "/root/miniconda3/envs/generative_models/lib/python3.10/site-packages/streamlit/runtime/caching/cache_resource_api.py", line 500, in read_result
    raise CacheKeyNotFoundError()
streamlit.runtime.caching.cache_errors.CacheKeyNotFoundError

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/root/miniconda3/envs/generative_models/lib/python3.10/site-packages/urllib3/connectionpool.py", line 715, in urlopen
    httplib_response = self._make_request(
  File "/root/miniconda3/envs/generative_models/lib/python3.10/site-packages/urllib3/connectionpool.py", line 404, in _make_request
    self._validate_conn(conn)
  File "/root/miniconda3/envs/generative_models/lib/python3.10/site-packages/urllib3/connectionpool.py", line 1058, in _validate_conn
    conn.connect()
  File "/root/miniconda3/envs/generative_models/lib/python3.10/site-packages/urllib3/connection.py", line 419, in connect
    self.sock = ssl_wrap_socket(
  File "/root/miniconda3/envs/generative_models/lib/python3.10/site-packages/urllib3/util/ssl_.py", line 449, in ssl_wrap_socket
    ssl_sock = _ssl_wrap_socket_impl(
  File "/root/miniconda3/envs/generative_models/lib/python3.10/site-packages/urllib3/util/ssl_.py", line 493, in _ssl_wrap_socket_impl
    return ssl_context.wrap_socket(sock, server_hostname=server_hostname)
  File "/root/miniconda3/envs/generative_models/lib/python3.10/ssl.py", line 513, in wrap_socket
    return self.sslsocket_class._create(
  File "/root/miniconda3/envs/generative_models/lib/python3.10/ssl.py", line 1104, in _create
    self.do_handshake()
  File "/root/miniconda3/envs/generative_models/lib/python3.10/ssl.py", line 1375, in do_handshake
    self._sslobj.do_handshake()
ConnectionResetError: [Errno 104] Connection reset by peer

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/root/miniconda3/envs/generative_models/lib/python3.10/site-packages/requests/adapters.py", line 486, in send
    resp = conn.urlopen(
  File "/root/miniconda3/envs/generative_models/lib/python3.10/site-packages/urllib3/connectionpool.py", line 799, in urlopen
    retries = retries.increment(
  File "/root/miniconda3/envs/generative_models/lib/python3.10/site-packages/urllib3/util/retry.py", line 550, in increment
    raise six.reraise(type(error), error, _stacktrace)
  File "/root/miniconda3/envs/generative_models/lib/python3.10/site-packages/urllib3/packages/six.py", line 769, in reraise
    raise value.with_traceback(tb)
  File "/root/miniconda3/envs/generative_models/lib/python3.10/site-packages/urllib3/connectionpool.py", line 715, in urlopen
    httplib_response = self._make_request(
  File "/root/miniconda3/envs/generative_models/lib/python3.10/site-packages/urllib3/connectionpool.py", line 404, in _make_request
    self._validate_conn(conn)
  File "/root/miniconda3/envs/generative_models/lib/python3.10/site-packages/urllib3/connectionpool.py", line 1058, in _validate_conn
    conn.connect()
  File "/root/miniconda3/envs/generative_models/lib/python3.10/site-packages/urllib3/connection.py", line 419, in connect
    self.sock = ssl_wrap_socket(
  File "/root/miniconda3/envs/generative_models/lib/python3.10/site-packages/urllib3/util/ssl_.py", line 449, in ssl_wrap_socket
    ssl_sock = _ssl_wrap_socket_impl(
  File "/root/miniconda3/envs/generative_models/lib/python3.10/site-packages/urllib3/util/ssl_.py", line 493, in _ssl_wrap_socket_impl
    return ssl_context.wrap_socket(sock, server_hostname=server_hostname)
  File "/root/miniconda3/envs/generative_models/lib/python3.10/ssl.py", line 513, in wrap_socket
    return self.sslsocket_class._create(
  File "/root/miniconda3/envs/generative_models/lib/python3.10/ssl.py", line 1104, in _create
    self.do_handshake()
  File "/root/miniconda3/envs/generative_models/lib/python3.10/ssl.py", line 1375, in do_handshake
    self._sslobj.do_handshake()
urllib3.exceptions.ProtocolError: ('Connection aborted.', ConnectionResetError(104, 'Connection reset by peer'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/root/miniconda3/envs/generative_models/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1247, in hf_hub_download
    metadata = get_hf_file_metadata(
  File "/root/miniconda3/envs/generative_models/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 118, in _inner_fn
    return fn(*args, **kwargs)
  File "/root/miniconda3/envs/generative_models/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1624, in get_hf_file_metadata
    r = _request_wrapper(
  File "/root/miniconda3/envs/generative_models/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 402, in _request_wrapper
    response = _request_wrapper(
  File "/root/miniconda3/envs/generative_models/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 425, in _request_wrapper
    response = get_session().request(method=method, url=url, **params)
  File "/root/miniconda3/envs/generative_models/lib/python3.10/site-packages/requests/sessions.py", line 589, in request
    resp = self.send(prep, **send_kwargs)
  File "/root/miniconda3/envs/generative_models/lib/python3.10/site-packages/requests/sessions.py", line 703, in send
    r = adapter.send(request, **kwargs)
  File "/root/miniconda3/envs/generative_models/lib/python3.10/site-packages/huggingface_hub/utils/_http.py", line 63, in send
    return super().send(request, *args, **kwargs)
  File "/root/miniconda3/envs/generative_models/lib/python3.10/site-packages/requests/adapters.py", line 501, in send
    raise ConnectionError(err, request=request)
requests.exceptions.ConnectionError: (ProtocolError('Connection aborted.', ConnectionResetError(104, 'Connection reset by peer')), '(Request ID: 711a3a32-4980-4a60-bb66-5d5a4e0e12b7)')

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/root/miniconda3/envs/generative_models/lib/python3.10/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 534, in _run_script
    exec(code, module.__dict__)
  File "/home/haodongli/generative-models/scripts/demo/video_sampling.py", line 121, in <module>
    state = init_st(version_dict, load_filter=True)
  File "/root/miniconda3/envs/generative_models/lib/python3.10/site-packages/streamlit/runtime/caching/cache_utils.py", line 212, in wrapper
    return cached_func(*args, **kwargs)
  File "/root/miniconda3/envs/generative_models/lib/python3.10/site-packages/streamlit/runtime/caching/cache_utils.py", line 241, in __call__
    return self._get_or_create_cached_value(args, kwargs)
  File "/root/miniconda3/envs/generative_models/lib/python3.10/site-packages/streamlit/runtime/caching/cache_utils.py", line 267, in _get_or_create_cached_value
    return self._handle_cache_miss(cache, value_key, func_args, func_kwargs)
  File "/root/miniconda3/envs/generative_models/lib/python3.10/site-packages/streamlit/runtime/caching/cache_utils.py", line 321, in _handle_cache_miss
    computed_value = self._info.func(*func_args, **func_kwargs)
  File "/home/haodongli/generative-models/scripts/demo/streamlit_helpers.py", line 46, in init_st
    model, msg = load_model_from_config(config, ckpt if load_ckpt else None)
  File "/home/haodongli/generative-models/scripts/demo/streamlit_helpers.py", line 86, in load_model_from_config
    model = instantiate_from_config(config.model)
  File "/home/haodongli/generative-models/sgm/util.py", line 175, in instantiate_from_config
    return get_obj_from_str(config["target"])(**config.get("params", dict()))
  File "/home/haodongli/generative-models/sgm/models/diffusion.py", line 59, in __init__
    self.conditioner = instantiate_from_config(
  File "/home/haodongli/generative-models/sgm/util.py", line 175, in instantiate_from_config
    return get_obj_from_str(config["target"])(**config.get("params", dict()))
  File "/home/haodongli/generative-models/sgm/modules/encoders/modules.py", line 79, in __init__
    embedder = instantiate_from_config(embconfig)
  File "/home/haodongli/generative-models/sgm/util.py", line 175, in instantiate_from_config
    return get_obj_from_str(config["target"])(**config.get("params", dict()))
  File "/home/haodongli/generative-models/sgm/modules/encoders/modules.py", line 1039, in __init__
    self.open_clip = instantiate_from_config(open_clip_embedding_config)
  File "/home/haodongli/generative-models/sgm/util.py", line 175, in instantiate_from_config
    return get_obj_from_str(config["target"])(**config.get("params", dict()))
  File "/home/haodongli/generative-models/sgm/modules/encoders/modules.py", line 591, in __init__
    model, _, _ = open_clip.create_model_and_transforms(
  File "/root/miniconda3/envs/generative_models/lib/python3.10/site-packages/open_clip/factory.py", line 382, in create_model_and_transforms
    model = create_model(
  File "/root/miniconda3/envs/generative_models/lib/python3.10/site-packages/open_clip/factory.py", line 281, in create_model
    checkpoint_path = download_pretrained(pretrained_cfg, cache_dir=cache_dir)
  File "/root/miniconda3/envs/generative_models/lib/python3.10/site-packages/open_clip/pretrained.py", line 552, in download_pretrained
    target = download_pretrained_from_hf(model_id, cache_dir=cache_dir)
  File "/root/miniconda3/envs/generative_models/lib/python3.10/site-packages/open_clip/pretrained.py", line 522, in download_pretrained_from_hf
    cached_file = hf_hub_download(model_id, filename, revision=revision, cache_dir=cache_dir)
  File "/root/miniconda3/envs/generative_models/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 118, in _inner_fn
    return fn(*args, **kwargs)
  File "/root/miniconda3/envs/generative_models/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1377, in hf_hub_download
    raise LocalEntryNotFoundError(
huggingface_hub.utils._errors.LocalEntryNotFoundError: An error happened while trying to locate the file on the Hub and we cannot find the requested files in the local cache. Please check your connection and try again or make sure your Internet connection is on.
```

Thanks for any help!
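For context, `LocalEntryNotFoundError` is raised when `huggingface_hub` can neither reach the Hub nor find the requested file in the local cache, so a useful first diagnostic is to confirm the machine can open a connection to huggingface.co at all. A minimal stdlib sketch (not part of this repo, just a connectivity probe):

```python
import socket

def can_reach(host: str, port: int = 443, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers DNS failures, timeouts, and resets such as Errno 104 above.
        return False

print(can_reach("huggingface.co"))
```

If this prints `False` (common on machines behind a firewall or without direct access to huggingface.co), the fix is to pre-download the weights elsewhere and point the code at a local copy, as discussed below in the thread.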

Brewnut-98 commented 7 months ago

I have a similar problem: Hugging Face is blocked in mainland China, so the connection fails. I know I need the open_clip weights, but I don't know where to put them.
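On the "where to put it" question: the traceback shows open_clip fetching the ViT-H-14 weights via `hf_hub_download`, which by default caches files under the Hugging Face hub cache (`$HF_HOME/hub`, usually `~/.cache/huggingface/hub`) in a `models--<org>--<name>` directory. A small sketch that computes that expected location (assuming the default hub cache layout):

```python
import os

def expected_cache_dir(repo_id: str) -> str:
    """Directory where hf_hub_download would cache files for repo_id,
    assuming the default Hugging Face hub cache layout."""
    hf_home = os.environ.get(
        "HF_HOME",
        os.path.join(os.path.expanduser("~"), ".cache", "huggingface"),
    )
    return os.path.join(hf_home, "hub", "models--" + repo_id.replace("/", "--"))

# Repo id taken from the open_clip pretrained config for ViT-H-14:
print(expected_cache_dir("laion/CLIP-ViT-H-14-laion2B-s32B-b79K"))
```

Note that inside that directory the cache stores files under `snapshots/<commit-hash>/`, so dropping a file at the top level may not be picked up; running `hf_hub_download` once on a machine with access and copying the whole `models--...` directory over is more reliable.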

timmal commented 7 months ago

> I have a similar problem: Hugging Face is blocked in mainland China, so the connection fails. I know I need the open_clip weights, but I don't know where to put them.

In a Colab notebook, this issue was solved by adding more RAM.

Cyclones-Y commented 7 months ago

> I have a similar problem: Hugging Face is blocked in mainland China, so the connection fails. I know I need the open_clip weights, but I don't know where to put them.

Hello, I've run into the same problem as you. Have you managed to solve it yet? I'm running this on a cloud server.

haodong2000 commented 7 months ago

Solved: just manually load the CLIP weights after line 279 of `/root/miniconda3/envs/generative_models/lib/python3.10/site-packages/open_clip/factory.py`:

```python
# Inside open_clip's create_model(), before download_pretrained() is called:
# redirect the named pretrained tag to a manually downloaded checkpoint.
if pretrained == 'laion2b_s32b_b79k' and model_name == 'ViT-H-14':
    # Replace 'your_path' with the directory holding the downloaded weights.
    pretrained = 'your_path/CLIP-ViT-H-14-laion2B-s32B-b79K/open_clip_pytorch_model.bin'
    print(f'==> load {model_name}, version: {pretrained}')
    pretrained_cfg = get_pretrained_cfg(model_name, pretrained)
    print(f'==> pretrained_cfg = {pretrained_cfg}')
```
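The same idea can be kept outside of site-packages. A minimal, testable sketch (the mapping and the checkpoint path below are illustrative assumptions, not part of the repo): keep a table from `(model_name, pretrained)` tags to local checkpoint files and rewrite the tag before calling open_clip, since recent open_clip versions also accept a filesystem path for `pretrained`.

```python
# Illustrative mapping from (model_name, pretrained tag) to manually
# downloaded checkpoints; the path below is a placeholder.
LOCAL_WEIGHTS = {
    ("ViT-H-14", "laion2b_s32b_b79k"):
        "your_path/CLIP-ViT-H-14-laion2B-s32B-b79K/open_clip_pytorch_model.bin",
}

def resolve_pretrained(model_name: str, pretrained: str) -> str:
    """Return a registered local checkpoint path if one exists,
    otherwise fall through to the original pretrained tag."""
    return LOCAL_WEIGHTS.get((model_name, pretrained.lower()), pretrained)

print(resolve_pretrained("ViT-H-14", "laion2b_s32b_b79k"))
print(resolve_pretrained("ViT-B-32", "openai"))
```

With such a table, the call site becomes `open_clip.create_model_and_transforms('ViT-H-14', pretrained=resolve_pretrained('ViT-H-14', 'laion2b_s32b_b79k'))`, with no edits to installed packages, assuming your open_clip version accepts a filesystem path for `pretrained`.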