I'm running into the same problem; it seems that none of the dependency parsing models can be loaded.
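A quick way to check that this is a connectivity problem rather than anything model-specific: the error log below shows the failure happens while transformers fetches the ELECTRA backbone hfl/chinese-electra-180g-small-discriminator from the Hugging Face Hub. The sketch below tries to fetch that backbone's config with plain transformers, leaving HanLP out of the loop entirely; AutoConfig.from_pretrained is standard transformers API, nothing HanLP-specific.

```python
# If this raises the same OSError about connecting to 'https://huggingface.co',
# the problem is network access / proxy settings, not HanLP or a particular
# dependency parsing model.
from transformers import AutoConfig

config = AutoConfig.from_pretrained("hfl/chinese-electra-180g-small-discriminator")
print(config.model_type)  # prints "electra" when the download or cache lookup works
```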
Read your log and do what Hugging Face already told you:
"Check out your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'."
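For reference, a minimal sketch of the two usual workarounds, assuming you can reach huggingface.co at least once (possibly through a proxy or on another machine). The repo id is copied from the error log below; TRANSFORMERS_OFFLINE and HF_HUB_OFFLINE are standard transformers / huggingface_hub switches, not HanLP settings, and HanLP will still fetch its own model archives from its own download server on first use.

```python
import os

# 1) While a connection to huggingface.co is available, cache the backbone
#    named in the error message so later runs can find it locally.
from huggingface_hub import snapshot_download
snapshot_download(repo_id="hfl/chinese-electra-180g-small-discriminator")

# 2) Afterwards, force transformers/huggingface_hub to use only the local
#    cache; set these environment variables BEFORE importing hanlp.
os.environ["TRANSFORMERS_OFFLINE"] = "1"
os.environ["HF_HUB_OFFLINE"] = "1"

import hanlp

# Same call as in the traceback below.
dep = hanlp.load('PMT1_DEP_ELECTRA_SMALL', conll=0)
```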
Describe the bug

================================ERROR LOG BEGINS================================
OS: Windows-10-10.0.19041-SP0
Python: 3.8.11
PyTorch: 1.9.1+cpu
HanLP: 2.1.0-beta.50

OSError                                   Traceback (most recent call last)
d:\PycharmProjects\mednlp_project\mednlp_train\mishu_entity_extract.ipynb Cell 64 in ()
      1 import hanlp
      3 HanLP = hanlp.pipeline() \
      4     .append(hanlp.utils.rules.split_sentence, output_key='sentences') \
      5     .append(hanlp.load('FINE_ELECTRA_SMALL_ZH'), output_key='tok') \
      6     .append(hanlp.load('CTB9_POS_ELECTRA_SMALL'), output_key='pos') \
      7     .append(hanlp.load('MSRA_NER_ELECTRA_SMALL_ZH'), output_key='ner', input_key='tok') \
----> 8     .append(hanlp.load('PMT1_DEP_ELECTRA_SMALL', conll=0), output_key='dep', input_key='tok') \
      9     .append(hanlp.load('CTB9_CON_ELECTRA_SMALL'), output_key='con', input_key='tok')

File d:\ProgramData\Anaconda3\envs\mednlp\lib\site-packages\hanlp\__init__.py:43, in load(save_dir, verbose, **kwargs)
     41 from hanlp_common.constant import HANLP_VERBOSE
     42 verbose = HANLP_VERBOSE
---> 43 return load_from_meta_file(save_dir, 'meta.json', verbose=verbose, **kwargs)

File d:\ProgramData\Anaconda3\envs\mednlp\lib\site-packages\hanlp\utils\component_util.py:186, in load_from_meta_file(save_dir, meta_filename, transform_only, verbose, **kwargs)
    184 except:
    185     pass
--> 186 raise e from None

File d:\ProgramData\Anaconda3\envs\mednlp\lib\site-packages\hanlp\utils\component_util.py:106, in load_from_meta_file(save_dir, meta_filename, transform_only, verbose, **kwargs)
    104 else:
    105     if os.path.isfile(os.path.join(save_dir, 'config.json')):
...
    459 if not _raise_exceptions_for_missing_entries:
OSError: We couldn't connect to 'https://huggingface.co' to load this file, couldn't find it in the cached files and it looks like hfl/chinese-electra-180g-small-discriminator is not the path to a directory containing a file named config.json.
Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.
=================================ERROR LOG ENDS=================================

Code to reproduce the issue
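The minimal reproduction, lifted directly from the notebook cell shown in the traceback above; the earlier loads succeed and the pipeline fails at the dep step.

```python
import hanlp

HanLP = hanlp.pipeline() \
    .append(hanlp.utils.rules.split_sentence, output_key='sentences') \
    .append(hanlp.load('FINE_ELECTRA_SMALL_ZH'), output_key='tok') \
    .append(hanlp.load('CTB9_POS_ELECTRA_SMALL'), output_key='pos') \
    .append(hanlp.load('MSRA_NER_ELECTRA_SMALL_ZH'), output_key='ner', input_key='tok') \
    .append(hanlp.load('PMT1_DEP_ELECTRA_SMALL', conll=0), output_key='dep', input_key='tok') \
    .append(hanlp.load('CTB9_CON_ELECTRA_SMALL'), output_key='con', input_key='tok')
```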
Describe the current behavior
Loading the dependency parsing model with hanlp.load('PMT1_DEP_ELECTRA_SMALL', conll=0) raises the OSError above: transformers cannot reach 'https://huggingface.co' and the backbone hfl/chinese-electra-180g-small-discriminator is not in the local cache.

Expected behavior
The dependency parsing model loads and can be appended to the pipeline like the other models.

System information
See the header of the error log above: Windows 10 (10.0.19041), Python 3.8.11, PyTorch 1.9.1+cpu, HanLP 2.1.0-beta.50.

Other info / logs
The full traceback is included in the error log above.