sinhat98 / adapter-wavlm

MIT License

What is this problem and how to solve it #4

Closed Minspeech closed 1 year ago

Minspeech commented 1 year ago

```
Traceback (most recent call last):
  File "/root/anaconda3/envs/ladapter/lib/python3.8/site-packages/transformers/configuration_utils.py", line 601, in _get_config_dict
    resolved_config_file = cached_path(
  File "/root/anaconda3/envs/ladapter/lib/python3.8/site-packages/transformers/utils/hub.py", line 284, in cached_path
    output_path = get_from_cache(
  File "/root/anaconda3/envs/ladapter/lib/python3.8/site-packages/transformers/utils/hub.py", line 554, in get_from_cache
    raise ValueError(
ValueError: Connection error, and we cannot find the requested files in the cached path. Please try again or make sure your Internet connection is on.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "train.py", line 313, in <module>
    main()
  File "train.py", line 172, in main
    model = AdaWavLMForCTC.from_pretrained(pretrained_model, model_config)
  File "/root/anaconda3/envs/ladapter/lib/python3.8/site-packages/transformers/modeling_utils.py", line 1922, in from_pretrained
    config, model_kwargs = cls.config_class.from_pretrained(
  File "/root/anaconda3/envs/ladapter/lib/python3.8/site-packages/transformers/configuration_utils.py", line 526, in from_pretrained
    config_dict, kwargs = cls.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/root/anaconda3/envs/ladapter/lib/python3.8/site-packages/transformers/configuration_utils.py", line 553, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/root/anaconda3/envs/ladapter/lib/python3.8/site-packages/transformers/configuration_utils.py", line 634, in _get_config_dict
    raise EnvironmentError(
OSError: We couldn't connect to 'https://huggingface.co' to load this model, couldn't find it in the cached files and it looks like microsoft/wavlm-base-plus is not the path to a directory containing a config.json file. Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.
```
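The traceback means `from_pretrained` could not reach huggingface.co and found no cached copy of `microsoft/wavlm-base-plus`. One workaround, assuming the checkpoint can be copied onto the training machine from elsewhere (e.g. via `git lfs clone https://huggingface.co/microsoft/wavlm-base-plus`), is to pass a local directory instead of the hub id. A minimal sketch — the `./wavlm-base-plus` path is a hypothetical example, not anything from `train.py`:

```python
import os
from pathlib import Path

# Hypothetical local copy of the checkpoint, fetched on a machine with
# network access and copied here (must contain config.json and the weights).
local_dir = Path("./wavlm-base-plus")

if (local_dir / "config.json").is_file():
    # Point from_pretrained at the local directory so no hub access is needed,
    # and force transformers to stay offline (documented env var).
    pretrained_model = str(local_dir)
    os.environ["TRANSFORMERS_OFFLINE"] = "1"
else:
    # Fall back to the hub id; this still requires a working connection.
    pretrained_model = "microsoft/wavlm-base-plus"

print(pretrained_model)
```

With the local directory in place, `AdaWavLMForCTC.from_pretrained(pretrained_model, model_config)` in `train.py` should load without contacting the hub.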