Traceback (most recent call last):
File "/root/raid/electrolyte_papers_extraction/NER/ner.py", line 31, in <module>
ner = NER('ckpt/ner/240731013859.938009/output_best', device = 'gpu')
File "/root/raid/electrolyte_papers_extraction/NER/ner.py", line 12, in __init__
self.pipeline = pipeline(Tasks.named_entity_recognition, abspath(ckpt), device = device)
File "/usr/local/lib/python3.10/dist-packages/modelscope/pipelines/builder.py", line 169, in pipeline
return build_pipeline(cfg, task_name=task)
File "/usr/local/lib/python3.10/dist-packages/modelscope/pipelines/builder.py", line 65, in build_pipeline
return build_from_cfg(
File "/usr/local/lib/python3.10/dist-packages/modelscope/utils/registry.py", line 215, in build_from_cfg
raise type(e)(f'{obj_cls.__name__}: {e}')
RuntimeError: SequenceLabelingPipeline: SequenceLabelingModel: TransformerEmbedder: Try loading from huggingface and modelscope failed
huggingface:
The request model: google-bert/bert-base-cased does not exist!
modelscope:
The request model: google-bert/bert-base-cased does not exist!
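The failure means the checkpoint's configuration.json points the `TransformerEmbedder` backbone at `google-bert/bert-base-cased`, and neither the Hugging Face nor the ModelScope hub lookup can resolve that id in this environment. A hypothetical excerpt of what the offending entry may look like — the key names here are assumptions for illustration, not copied from the actual file:

```json
{
  "embedder": {
    "model_name_or_path": "google-bert/bert-base-cased"
  }
}
```

Pointing that value at a locally downloaded copy of the backbone (or an id the hub can serve) is the usual way around the lookup failure.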
What is your question?
error message:
self-trained NER checkpoint:
https://github.com/breadbread1984/electrolyte_papers_extraction/tree/main/NER/ckpt/ner/240731013859.938009
What have you tried?
Under ckpt/ner/, edit <path/to/latest/checkpoint>/output_best/configuration.json to change the following lines
from
to
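The manual edit above can also be scripted. Below is a minimal sketch that rewrites the backbone reference in a checkpoint's configuration.json, assuming the model id appears verbatim in the file; the example paths and the target id are assumptions, so adjust them to the real checkpoint layout.

```python
import json

def retarget_backbone(cfg_path, old_id, new_ref):
    """Replace every occurrence of old_id in configuration.json with new_ref.

    Works on the raw text so it is agnostic to where in the JSON tree the
    model id appears; re-parses afterwards to confirm the file is still
    valid JSON, and returns the parsed config.
    """
    with open(cfg_path) as f:
        text = f.read()
    patched = text.replace(old_id, new_ref)
    with open(cfg_path, "w") as f:
        f.write(patched)
    return json.loads(patched)  # raises if the edit broke the JSON

# Hypothetical usage (paths are assumptions, not from the repo):
# retarget_backbone(
#     "ckpt/ner/240731013859.938009/output_best/configuration.json",
#     "google-bert/bert-base-cased",
#     "/root/models/bert-base-cased",  # locally downloaded backbone
# )
```

After retargeting, re-run the pipeline so the embedder loads the local copy instead of querying the hubs.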
Code (if necessary)
source code:
What's your environment?
Code of Conduct