Closed: dbbice closed this issue 1 year ago
Hi @dbbice,
Thank you for your attention.
This is an internet connection issue. The code downloads the BERT model from the Hugging Face hub before training; please run this snippet on your server to verify that you can reach the hub.
from transformers import AutoConfig
vocab_size = AutoConfig.from_pretrained('bert-base-uncased').vocab_size
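If the import works but the download still hangs or fails, a stdlib-only connectivity check (a sketch for diagnosis, not part of the project code) can confirm whether the server can reach huggingface.co at all:

```python
import urllib.request

def can_reach(url: str, timeout: float = 5.0) -> bool:
    """Return True if an HTTP(S) request to url succeeds within timeout seconds."""
    try:
        with urllib.request.urlopen(url, timeout=timeout):
            return True
    except Exception:
        # DNS failure, blocked proxy, timeout, TLS error, etc.
        return False

if __name__ == "__main__":
    # False here means the machine cannot reach the hub (network or proxy problem).
    print(can_reach("https://huggingface.co"))
```

If this prints False, the problem is the server's network or proxy configuration, not the training code.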
python3 main.py -dataset WN18RR \
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "main.py", line 176, in <module>
configs.vocab_size = AutoConfig.from_pretrained(configs.pretrained_model).vocab_size
File "/home/wsco/anaconda3/envs/env_DRE/lib/python3.6/site-packages/transformers/models/auto/configuration_auto.py", line 652, in from_pretrained
config_dict, _ = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
File "/home/wsco/anaconda3/envs/env_DRE/lib/python3.6/site-packages/transformers/configuration_utils.py", line 548, in get_config_dict
config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
File "/home/wsco/anaconda3/envs/env_DRE/lib/python3.6/site-packages/transformers/configuration_utils.py", line 630, in _get_config_dict
f"We couldn't connect to '{HUGGINGFACE_CO_RESOLVE_ENDPOINT}' to load this model, couldn't find it in the cached "
OSError: We couldn't connect to 'https://huggingface.co' to load this model, couldn't find it in the cached files and it looks like bert-large-uncased is not the path to a directory containing a {configuration_file} file.
Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.
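If the server has no direct internet access, the offline-mode route the error message points to is an option. A hedged sketch (paths and the scp destination are placeholders, not from this thread): download the model on a machine that can reach the hub, copy it over, and force offline mode.

```shell
# 1) On a machine WITH internet access, fetch the model files:
#    python -m pip install huggingface_hub
#    python -c "from huggingface_hub import snapshot_download; snapshot_download('bert-base-uncased', local_dir='bert-base-uncased')"
# 2) Copy the directory to the offline server, e.g.:
#    scp -r bert-base-uncased wsco@server:/home/wsco/models/bert-base-uncased
# 3) On the server, force transformers to use only local files:
export TRANSFORMERS_OFFLINE=1
export HF_HUB_OFFLINE=1
# 4) Then pass '/home/wsco/models/bert-base-uncased' wherever the code
#    currently passes the model name (e.g. configs.pretrained_model).
```

With the local directory in place, AutoConfig.from_pretrained accepts the path directly and never touches the network.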
—————————————————— Hello, how should this problem be solved?