Describe the bug
I want to use the Abstract Meaning Representation (AMR) feature locally, but following the tutorial I hit an error as soon as hanlp.load is called to load the model:
amr = hanlp.load('MRP2020_AMR_ENG_ZHO_XLM_BASE')
Code to reproduce the issue
import hanlp
amr = hanlp.load('MRP2020_AMR_ENG_ZHO_XLM_BASE')
Describe the current behavior
Running this on Linux produces the error below. My network connection is fine, and every other model I loaded afterwards worked; only the AMR model fails to load.
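For reference, a minimal connectivity probe to confirm which host is unreachable (an illustrative sketch, not from the original report; it assumes, based on the traceback, that this model needs both the zip from download.hanlp.com and the xlm-roberta-base files from huggingface.co — the zip URL is copied from the log, and the config.json URL is the standard Hugging Face resolve path):

```python
# Illustrative diagnostic (not part of the original report): probe the two
# hosts this model depends on and report the result for each.
import requests

urls = [
    'http://download.hanlp.com/amr/extra/amr-eng-zho-xlm-roberta-base_20220412_223756.zip',
    'https://huggingface.co/xlm-roberta-base/resolve/main/config.json',
]
for url in urls:
    try:
        r = requests.head(url, timeout=10, allow_redirects=True)
        print(f'{url} -> HTTP {r.status_code}')
    except requests.RequestException as e:
        print(f'{url} -> FAILED: {e}')
```

The full console output follows.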
Failed to load http://download.hanlp.com/amr/extra/amr-eng-zho-xlm-roberta-base_20220412_223756.zip
If the problem still persists, please submit an issue to https://github.com/hankcs/HanLP/issues
When reporting an issue, make sure to paste the FULL ERROR LOG below.
================================ERROR LOG BEGINS================================
OS: Linux-5.19.0-32-generic-x86_64-with-glibc2.35
Python: 3.10.13
PyTorch: 2.0.0
HanLP: 2.1.0-beta.57
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/why/miniconda3/envs/HanLP/lib/python3.10/site-packages/hanlp/__init__.py", line 43, in load
    return load_from_meta_file(save_dir, 'meta.json', verbose=verbose, **kwargs)
  File "/home/why/miniconda3/envs/HanLP/lib/python3.10/site-packages/hanlp/utils/component_util.py", line 186, in load_from_meta_file
    raise e from None
  File "/home/why/miniconda3/envs/HanLP/lib/python3.10/site-packages/hanlp/utils/component_util.py", line 106, in load_from_meta_file
    obj.load(save_dir, verbose=verbose, **kwargs)
  File "/home/why/miniconda3/envs/HanLP/lib/python3.10/site-packages/hanlp/common/torch_component.py", line 173, in load
    self.load_config(save_dir, **kwargs)
  File "/home/why/miniconda3/envs/HanLP/lib/python3.10/site-packages/perin_parser/perin_parser.py", line 338, in load_config
    super().load_config(save_dir, filename, **kwargs)
  File "/home/why/miniconda3/envs/HanLP/lib/python3.10/site-packages/hanlp/common/torch_component.py", line 126, in load_config
    self.on_config_ready(**self.config, save_dir=save_dir)
  File "/home/why/miniconda3/envs/HanLP/lib/python3.10/site-packages/perin_parser/perin_parser.py", line 320, in on_config_ready
    self._tokenizer = AutoTokenizer.from_pretrained(encoder)
  File "/home/why/miniconda3/envs/HanLP/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 782, in from_pretrained
    config = AutoConfig.from_pretrained(
  File "/home/why/miniconda3/envs/HanLP/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py", line 1111, in from_pretrained
    config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/home/why/miniconda3/envs/HanLP/lib/python3.10/site-packages/transformers/configuration_utils.py", line 633, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/home/why/miniconda3/envs/HanLP/lib/python3.10/site-packages/transformers/configuration_utils.py", line 688, in _get_config_dict
    resolved_config_file = cached_file(
  File "/home/why/miniconda3/envs/HanLP/lib/python3.10/site-packages/transformers/utils/hub.py", line 441, in cached_file
    raise EnvironmentError(
OSError: We couldn't connect to 'https://huggingface.co' to load this file, couldn't find it in the cached files and it looks like xlm-roberta-base is not the path to a directory containing a file named config.json.
Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.
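A hedged reading of the last frame: the HanLP zip itself does not appear to be the problem; the load fails inside transformers while it resolves 'xlm-roberta-base' from huggingface.co. If so, warming the Hugging Face cache once, from a connection that can reach huggingface.co, should let later loads succeed (a sketch under that assumption; it presumes the process calling hanlp.load uses the same HF cache directory):

```python
# Sketch of a cache-warming workaround, not a confirmed fix: fetch the
# encoder's config and tokenizer once; transformers reuses the local cache
# on subsequent hanlp.load calls.
from transformers import AutoConfig, AutoTokenizer

AutoConfig.from_pretrained('xlm-roberta-base')
AutoTokenizer.from_pretrained('xlm-roberta-base')
```

Once the cache is populated, setting TRANSFORMERS_OFFLINE=1 (the offline mode linked in the error message) should keep transformers from touching the network at all.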
Expected behavior
The AMR model loads successfully, the same way the other HanLP models do.
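For illustration, what I expect to work once loading succeeds (a sketch; it assumes the AMR component is callable on a raw sentence like other HanLP components — the sentence below is arbitrary):

```python
# Hypothetical usage once the model loads; the sentence is arbitrary.
graph = amr('男孩希望女孩相信他。')  # "The boy wants the girl to believe him."
print(graph)  # expected: an AMR graph for the sentence
```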
System information
OS: Linux-5.19.0-32-generic-x86_64-with-glibc2.35
Python: 3.10.13
PyTorch: 2.0.0
HanLP: 2.1.0-beta.57
Other info / logs
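If huggingface.co is blocked on this network, another possible workaround (an assumption on my part: HanLP hands the encoder name straight to transformers, so the huggingface_hub endpoint variable applies) is to redirect the hub to a mirror before importing hanlp:

```python
# Sketch: redirect Hugging Face Hub downloads to a mirror. HF_ENDPOINT is
# read when huggingface_hub is imported, so set it before importing hanlp.
# The mirror URL is only an example.
import os
os.environ['HF_ENDPOINT'] = 'https://hf-mirror.com'

import hanlp
amr = hanlp.load('MRP2020_AMR_ENG_ZHO_XLM_BASE')
```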
[x] I've completed this form and searched the web for solutions.