tzwm / sd-scripts

Some scripts for stable-diffusion-webui on AutoDL

Running ComfyUI's segment anything on AutoDL fails with a Hugging Face connection error #4

Closed uiueux closed 8 months ago

uiueux commented 9 months ago

Update 2: it is still quite unstable, though. Update: resolved — academic acceleration is provided: https://www.autodl.com/docs/network_turbo/

This is probably a network issue, right? Does AutoDL ship with a proxy by default? And if AutoDL has no proxy, how should this be solved?

Here is the error:

Error occurred when executing GroundingDinoModelLoader (segment anything):

(MaxRetryError("HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /bert-base-uncased/resolve/main/tokenizer_config.json (Caused by ProxyError('Cannot connect to proxy.', TimeoutError('_ssl.c:980: The handshake operation timed out')))"), '(Request ID: ecb9a516-6c9d-4cc9-bea4-4a99dd6f0596)')

```
File "/root/ComfyUI/execution.py", line 152, in recursive_execute
  output_data, output_ui = get_output_data(obj, input_data_all)
File "/root/ComfyUI/execution.py", line 82, in get_output_data
  return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
File "/root/ComfyUI/execution.py", line 75, in map_node_over_list
  results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
File "/root/ComfyUI/custom_nodes/comfyui_segment_anything/node.py", line 286, in main
  dino_model = load_groundingdino_model(model_name)
File "/root/ComfyUI/custom_nodes/comfyui_segment_anything/node.py", line 126, in load_groundingdino_model
  dino = local_groundingdino_build_model(dino_model_args)
File "/root/ComfyUI/custom_nodes/comfyui_segment_anything/local_groundingdino/models/__init__.py", line 17, in build_model
  model = build_func(args)
File "/root/ComfyUI/custom_nodes/comfyui_segment_anything/local_groundingdino/models/GroundingDINO/groundingdino.py", line 362, in build_groundingdino
  model = GroundingDINO(
File "/root/ComfyUI/custom_nodes/comfyui_segment_anything/local_groundingdino/models/GroundingDINO/groundingdino.py", line 97, in __init__
  self.tokenizer = get_tokenlizer.get_tokenlizer(text_encoder_type)
File "/root/ComfyUI/custom_nodes/comfyui_segment_anything/local_groundingdino/util/get_tokenlizer.py", line 19, in get_tokenlizer
  tokenizer = AutoTokenizer.from_pretrained(text_encoder_type)
File "/root/miniconda3/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 598, in from_pretrained
  tokenizer_config = get_tokenizer_config(pretrained_model_name_or_path, **kwargs)
File "/root/miniconda3/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 442, in get_tokenizer_config
  resolved_config_file = cached_file(
File "/root/miniconda3/lib/python3.10/site-packages/transformers/utils/hub.py", line 409, in cached_file
  resolved_file = hf_hub_download(
File "/root/miniconda3/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 118, in _inner_fn
  return fn(*args, **kwargs)
File "/root/miniconda3/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1238, in hf_hub_download
  metadata = get_hf_file_metadata(
File "/root/miniconda3/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 118, in _inner_fn
  return fn(*args, **kwargs)
File "/root/miniconda3/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1631, in get_hf_file_metadata
  r = _request_wrapper(
File "/root/miniconda3/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 385, in _request_wrapper
  response = _request_wrapper(
File "/root/miniconda3/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 408, in _request_wrapper
  response = get_session().request(method=method, url=url, **params)
File "/root/miniconda3/lib/python3.10/site-packages/requests/sessions.py", line 587, in request
  resp = self.send(prep, **send_kwargs)
File "/root/miniconda3/lib/python3.10/site-packages/requests/sessions.py", line 701, in send
  r = adapter.send(request, **kwargs)
File "/root/miniconda3/lib/python3.10/site-packages/huggingface_hub/utils/_http.py", line 67, in send
  return super().send(request, *args, **kwargs)
File "/root/miniconda3/lib/python3.10/site-packages/requests/adapters.py", line 559, in send
  raise ProxyError(e, request=request)
```
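The `ProxyError` at the bottom of the traceback means `requests` was routed through a proxy that timed out during the TLS handshake, so it is worth checking whether a proxy is actually configured in the environment. A small diagnostic sketch (the `proxy_settings` helper is hypothetical, not part of ComfyUI or any library):

```python
import os

def proxy_settings() -> dict:
    """Collect the proxy-related environment variables that requests/urllib3 honor."""
    keys = ("http_proxy", "https_proxy", "HTTP_PROXY", "HTTPS_PROXY",
            "no_proxy", "NO_PROXY")
    return {k: os.environ[k] for k in keys if k in os.environ}

if __name__ == "__main__":
    cfg = proxy_settings()
    if cfg:
        print("Proxy configured via environment:", cfg)
    else:
        print("No proxy in the environment; the proxy may be set elsewhere "
              "(e.g. AutoDL's academic acceleration).")
```

If this prints a proxy address, unsetting those variables (or re-sourcing the acceleration script) is a quick way to rule the proxy in or out as the cause.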

tzwm commented 8 months ago

AutoDL academic acceleration is enabled by default on my side, but the acceleration itself is indeed sometimes unstable at peak hours, and there is nothing I can do about that. I may look into switching to https://hf-mirror.com/ later.
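Switching to the mirror does not require changing the node's code: recent `huggingface_hub` versions read the `HF_ENDPOINT` environment variable and resolve downloads against it instead of huggingface.co. A minimal sketch, assuming a `huggingface_hub` version that honors this variable (some versions read it once at import time, so it must be set before the library is imported):

```python
import os

# Must be set BEFORE huggingface_hub / transformers are imported,
# e.g. at the very top of ComfyUI's entry point, or in the shell:
#   HF_ENDPOINT=https://hf-mirror.com python main.py
os.environ["HF_ENDPOINT"] = "https://hf-mirror.com"

# From here on, hf_hub_download() and AutoTokenizer.from_pretrained()
# should fetch files from the mirror instead of huggingface.co.
```

Setting the variable in the shell that launches ComfyUI is the safer option, since it guarantees the value is present before any import runs.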

chwshuang commented 4 months ago

Install all three of these models as instructed: https://github.com/storyicon/comfyui_segment_anything?tab=readme-ov-file#bert-base-uncased
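Once the models (including the `bert-base-uncased` tokenizer files) are on disk, the tokenizer can be loaded with no network access at all. A sketch under the assumption that the files were placed in `models/bert-base-uncased` under the ComfyUI root, as the linked README describes; `local_files_only=True` is a real `transformers` parameter that forbids any Hub request:

```python
import os

# Path follows the comfyui_segment_anything README layout; adjust to your setup.
BERT_DIR = "/root/ComfyUI/models/bert-base-uncased"

def load_local_tokenizer(path: str):
    """Load bert-base-uncased strictly from disk; return None if files are absent."""
    if not os.path.isdir(path):
        return None
    from transformers import AutoTokenizer  # imported lazily
    # local_files_only=True guarantees no request to huggingface.co is made
    return AutoTokenizer.from_pretrained(path, local_files_only=True)

tok = load_local_tokenizer(BERT_DIR)
print("tokenizer loaded offline" if tok else f"place tokenizer files in {BERT_DIR} first")
```

With the files in place, the `GroundingDinoModelLoader` node never needs to reach huggingface.co, which sidesteps the proxy problem entirely.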