eric-ai-lab / Aerial-Vision-and-Dialog-Navigation

Codebase for the ACL 2023 Findings paper "Aerial Vision-and-Dialog Navigation"
https://sites.google.com/view/aerial-vision-and-dialog/home

How can I resolve this network connection error during training? OSError: We couldn't connect to 'https://huggingface.co' to load this model, couldn't find it in the cached files and it looks like bert-base-uncased is not the path to a directory containing a config.json file. #4

Closed: Celinewxy closed this issue 6 months ago

Celinewxy commented 1 year ago

Downloading: 100%|██████████| 28.0/28.0 [00:00<00:00, 32.6kB/s]
Traceback (most recent call last):
  File "/root/miniconda3/lib/python3.8/site-packages/transformers/configuration_utils.py", line 601, in _get_config_dict
    resolved_config_file = cached_path(
  File "/root/miniconda3/lib/python3.8/site-packages/transformers/utils/hub.py", line 283, in cached_path
    output_path = get_from_cache(
  File "/root/miniconda3/lib/python3.8/site-packages/transformers/utils/hub.py", line 553, in get_from_cache
    raise ValueError(
ValueError: Connection error, and we cannot find the requested files in the cached path. Please try again or make sure your Internet connection is on.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "xview_et/main.py", line 314, in <module>
    main()
  File "xview_et/main.py", line 305, in main
    train_env, train_full_traj_env, val_envs, val_full_traj_envs = build_dataset(args, rank=rank)
  File "xview_et/main.py", line 30, in build_dataset
    tok = get_tokenizer(args)
  File "xview_et/main.py", line 26, in get_tokenizer
    tokenizer = AutoTokenizer.from_pretrained(cfg_name)
  File "/root/miniconda3/lib/python3.8/site-packages/transformers/models/auto/tokenization_auto.py", line 511, in from_pretrained
    config = AutoConfig.from_pretrained(
  File "/root/miniconda3/lib/python3.8/site-packages/transformers/models/auto/configuration_auto.py", line 680, in from_pretrained
    config_dict, _ = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/root/miniconda3/lib/python3.8/site-packages/transformers/configuration_utils.py", line 553, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/root/miniconda3/lib/python3.8/site-packages/transformers/configuration_utils.py", line 634, in _get_config_dict
    raise EnvironmentError(
OSError: We couldn't connect to 'https://huggingface.co' to load this model, couldn't find it in the cached files and it looks like bert-base-uncased is not the path to a directory containing a config.json file. Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.
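For reference, the offline-mode documentation linked in the error message boils down to caching the model ahead of time. A minimal sketch, assuming you can run a one-off download on a machine with internet access; the local directory name below is hypothetical, not something the repo defines:

```python
# Sketch of the offline workaround: download bert-base-uncased once while online,
# save it to a local directory, and load from that directory during training.
from transformers import AutoConfig, AutoTokenizer

local_dir = "./pretrained/bert-base-uncased"  # hypothetical local path

# Run while online: downloads the files and writes them (including config.json) to local_dir.
AutoTokenizer.from_pretrained("bert-base-uncased").save_pretrained(local_dir)
AutoConfig.from_pretrained("bert-base-uncased").save_pretrained(local_dir)

# Later, offline: loading from the local directory needs no network access.
tokenizer = AutoTokenizer.from_pretrained(local_dir)
```

Once the files exist locally, setting the environment variable TRANSFORMERS_OFFLINE=1 keeps transformers from attempting any further network calls.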

yifeisu commented 1 year ago

Hello, I often run into the issue you mentioned. It seems to be a problem with the transformers library rather than with this codebase. I resolved it with the following steps:

  1. Make sure the transformers versions installed in your different conda virtual environments are as consistent as possible, especially when they all download weights into the same cache directory (see the version/cache check sketched below).
  2. Re-run the command and let it retry the download until it succeeds.
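To check point 1, here is a small sketch (not from the repo) that prints the transformers version and the cache location used by whichever conda environment is active; run it in each environment that shares the cache and compare the output:

```python
# Sketch: print the transformers version and the cache directory the current
# environment would use, to compare across conda environments.
import os
import transformers

print("transformers version:", transformers.__version__)
# TRANSFORMERS_CACHE is only set if you overrode it; otherwise transformers
# falls back to its default cache under ~/.cache/huggingface.
print("cache directory:", os.environ.get("TRANSFORMERS_CACHE", "<default under ~/.cache/huggingface>"))
```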