Traceback (most recent call last):
  File "/root/miniconda3/envs/torch_nlp/lib/python3.8/site-packages/transformers/configuration_utils.py", line 594, in _get_config_dict
    resolved_config_file = cached_path(
  File "/root/miniconda3/envs/torch_nlp/lib/python3.8/site-packages/transformers/file_utils.py", line 1936, in cached_path
    raise EnvironmentError(f"file {url_or_filename} not found")
OSError: file hf_models/uie-char-small/config.json not found

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "run_uie_finetune.py", line 521, in <module>
    main()
  File "run_uie_finetune.py", line 153, in main
    config = AutoConfig.from_pretrained(
  File "/root/miniconda3/envs/torch_nlp/lib/python3.8/site-packages/transformers/models/auto/configuration_auto.py", line 637, in from_pretrained
    config_dict, _ = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/root/miniconda3/envs/torch_nlp/lib/python3.8/site-packages/transformers/configuration_utils.py", line 546, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/root/miniconda3/envs/torch_nlp/lib/python3.8/site-packages/transformers/configuration_utils.py", line 630, in _get_config_dict
    raise EnvironmentError(
OSError: Can't load config for 'hf_models/uie-char-small'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure 'hf_models/uie-char-small' is the correct path to a directory containing a config.json file
Hello, and thank you for your open-source work. While studying UIE, I tried to run through this example: https://github.com/universal-ie/UIE/blob/main/docs/example_of_weibo_entity.md. Following the steps, I extracted the dataset and placed it in the specified location. The model was downloaded from the shared link. I created hf_models under the UIE directory, put the archive inside, and extracted it there. Running the example then fails with the error above, saying the model cannot be loaded. I don't know why this happens and would appreciate your help.
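For reference, the traceback shows that transformers treats 'hf_models/uie-char-small' as a local directory and fails because it cannot find a config.json inside it. This usually means the script is not being run from the directory the relative path expects, or the archive unpacked into an extra nested folder. Below is a minimal diagnostic sketch along those lines; the directory name is taken from the error message, and the nested-folder possibility is only an assumption about how the archive may have been extracted:

```python
import os
from transformers import AutoConfig

# Path copied from the error message; run this from the UIE repository root,
# i.e. the same working directory used for run_uie_finetune.py.
model_dir = "hf_models/uie-char-small"

if not os.path.isdir(model_dir):
    # The relative path does not resolve from the current working directory.
    print(f"{model_dir} not found relative to {os.getcwd()}")
else:
    # A common cause of this error (assumption): the archive unpacked into an
    # extra nested folder, so config.json sits one level deeper than expected.
    print("contents:", os.listdir(model_dir))
    if os.path.isfile(os.path.join(model_dir, "config.json")):
        # If config.json is really there, loading the config directly should work.
        config = AutoConfig.from_pretrained(model_dir)
        print("loaded config for model type:", config.model_type)
```

If the listing shows another folder nested inside (rather than config.json and the weight files), pointing the script's model path at the directory that actually contains config.json, or moving those files up one level, should resolve the error.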